December 12, 2013
On Thu, Dec 12, 2013 at 09:20:38PM +0100, Joseph Rushton Wakeling wrote:
> On 12/12/13 19:52, John Colvin wrote:
> >Delay between people isn't really the problem, it's delay in hearing yourself that's the killer.
> 
> Think of people taking their musical cues from others they hear with a delay, while those others are taking their cues from _them_ in turn, and the feedback effect that might result ... :-)  You have to get used to the fact that the right time to play may sound like the wrong time to play relative to some other group spatially separated from you.
> 
> By the same token, if everyone plays precisely with the conductor, they don't actually play precisely together as far as the audience is concerned, which is why professional orchestras tend to play a bit behind the conductor's beat.

Ahh, so *that's* why they do that!! I've always been wondering why the orchestra always seems to be out-of-beat with the conductor, and why the conductor's beats don't seem to line up with the actual sound.

Thanks!!


T

-- 
If it's green, it's biology, If it stinks, it's chemistry, If it has numbers it's math, If it doesn't work, it's technology.
December 12, 2013
On Thursday, 12 December 2013 at 10:43:24 UTC, Manu wrote:
> Are there any music game nerds hanging around here who would be interested
> in joining a side project like this?

I'm... interested. That's about all I can commit to at the moment :-)

As for skillset... well, I've been doing gamedev commercially for about 5 years now (the last 4 years on console/PC), and I generally do a mix of high-level game logic and core systems. I could quite happily do graphics and physics as well. Not massively experienced with audio, unfortunately. I do play guitar though :-D

Please keep me informed.
December 12, 2013
On 12/12/13 22:13, H. S. Teoh wrote:
> Ahh, so *that's* why they do that!! I've always been wondering why the
> orchestra always seems to be out-of-beat with the conductor, and why the
> conductor's beats don't seem to line up with the actual sound.

There's quite a nice blog post describing some of the reasons behind this here, if you're interested:
http://blog.davidhthomas.net/2006/12/but-im-with-the-conductor/

December 13, 2013
On 13 December 2013 03:43, Joseph Rushton Wakeling <joseph.wakeling@webdrake.net> wrote:

> On 12/12/13 16:47, Jacob Carlborg wrote:
>
>> Instead I had to time the screen to get any points.
>>
>
> Not defending Guitar Hero here, but sometimes it is necessary to follow visual rather than sonic cues in performance -- e.g. the brass players and others at the back of a symphony orchestra will often play ahead of the musicians at the very front, because their sound takes longer to reach the audience.  There's a lot of subtle internal stuff that goes on, with different sections of the orchestra having to react and play differently in order to keep the whole together, and a lot of that needs to be modulated by following visual cues of one kind or another from various people, sometimes against the grain of what your ears are telling you.
>

Hey, that's a really interesting thought actually. I never considered that.
I suppose this would only be applicable in a large auditorium though?
I have a whole new appreciation for the conductor upon that revelation alone! :)
Modern music has conveniently placed foldback speakers, or even in-ear monitors... everyone hears each other at the same time, so it's easy enough to synchronise.

> Then there are things like some extreme contemporary music where different
> musicians are effectively in different tempi -- you can play with a click-track, but sometimes it's easier or preferable to have flashing lights give you your own personal tempo.
>
> Plugged-in performance isn't really my area, but it wouldn't surprise me if having to deal with latency is an occasional occupational challenge there -- can anyone confirm? :-)
>

Plugged-in performance still uses low-tech analog technology, specifically to eliminate latency. Modern recording hardware has an element of wrangling latency, although that was more of a problem 5-10 years ago than it is now. It all comes from the switch to digital technology, where you need to start buffering blocks of audio and performing burst transmissions; that's where latency really starts creeping into the system.
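To put rough numbers on it (typical figures only, nothing vendor-specific): the per-block latency is just buffer size over sample rate, and you pay it at least twice, once on the way in and once on the way out, plus whatever the driver and hardware add. In D:

// Rough relationship between audio buffer size and latency; illustrative
// numbers only, not tied to any particular driver or API.
double bufferLatencyMs(int bufferSamples, int sampleRate)
{
    return bufferSamples * 1000.0 / sampleRate;
}

// e.g. bufferLatencyMs(256, 48_000)  is ~5.3ms per block,
//      bufferLatencyMs(1024, 44_100) is ~23ms -- already past the commonly
//      quoted ~20ms noticeability threshold before counting the return trip.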
It's certainly prevalent in GH, which runs on cheap consumer hardware that isn't really even designed for true low-latency gameplay (not to the standards a good musician expects).
The work-around is a bunch of in-game knobs to specify latency offsets for the various outputs (video/audio) against the input (controllers/keyboard/MIDI), which may also have its own latency.
People need to understand the problem properly before they can configure those knobs, and most casual players just don't bother; they play right off the screen and ignore the music.
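For a sense of what those knobs feed into, here's a sketch of a hit test with the offsets applied -- the names and numbers are purely hypothetical, not lifted from any actual GH/RB code:

// Hypothetical calibration data, as configured by the in-game knobs.
struct Calibration
{
    double videoOffsetMs = 0;  // display lag; applied when drawing the note highway
    double audioOffsetMs = 0;  // output lag between the game clock and what you hear
    double inputOffsetMs = 0;  // lag on strums arriving from the controller/MIDI kit
}

// Decide whether a strum received at inputTimeMs hits a note charted at
// noteTimeMs, compensating for the configured latencies.
bool judgeHit(double inputTimeMs, double noteTimeMs,
              Calibration cal, double hitWindowMs = 50)
{
    import std.math : abs;
    // The strum actually happened earlier than the time we received it...
    double whenPlayed = inputTimeMs - cal.inputOffsetMs;
    // ...and the player was reacting to audio that was itself late, so the
    // note's "correct" moment from their point of view is shifted too.
    double reference = noteTimeMs + cal.audioOffsetMs;
    return abs(whenPlayed - reference) <= hitWindowMs;
}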


December 13, 2013
On 13 December 2013 04:48, John Colvin <john.loughran.colvin@gmail.com> wrote:

> On Thursday, 12 December 2013 at 17:04:32 UTC, Manu wrote:
>
>> On 13 December 2013 02:28, John Colvin <john.loughran.colvin@gmail.com> wrote:
>>
>>> On Thursday, 12 December 2013 at 15:47:48 UTC, Jacob Carlborg wrote:
>>>
>>>  On 2013-12-12 11:43, Manu wrote:
>>>>
>>>>  So, I'm a massive fan of music games. I'll shamefully admit that I was
>>>>> tragically addicted to Dance Dance Revolution about 10 years ago. Recently, it's Guitar Hero and Rock Band.
>>>>>
>>>>> I quite like the band ensemble games, they're good party games, and great rhythm practice that's actually applicable to real instrument skills too.
>>>>>
>>>>>
>>>> I wouldn't agree with that, at least not for Guitar Hero. I had issues
>>>> with the timing. When I played I tried to time the music, but that
>>>> didn't
>>>> work. Instead I had to time the screen to get any points.
>>>>
>>>>
>>> As a lifelong musician this annoyed the hell out of me. The whole experience ends up like reading annoying flashy sheet music. Low latency is critical.
>>>
>>>
>> It is quite sad how few people seem to know how, or make the effort, to set up their system properly, or know what the typical sources of latency are... or even that there's such a thing as latency in the first place. It's possible to make it near-perfect... but when I go to other people's houses and try to play, it never is. And then I start fiddling with their AV system for an hour while everyone gets angry at me, tells me it's perfect, and has no idea what I'm fussing about >_<
>>
>
> You just described my life :p
>

Hey, I hear you man! ;)


December 13, 2013
On 13 December 2013 04:52, John Colvin <john.loughran.colvin@gmail.com> wrote:

> On Thursday, 12 December 2013 at 18:31:58 UTC, Joseph Rushton Wakeling wrote:
>
>> On 12/12/13 19:15, Iain Buclaw wrote:
>>
>>> You know, I've never had that... but then again I haven't had the fortune of being in a band where the distance between the front and back musicians is > 200 metres.  (Because sound doesn't travel *that* slow ;)
>>>
>>
>> Well, it's not _just_ about the speed of sound, there are also things like the speed of attack of different instruments and so on.
>>
>> Then again, ever been to a performance of one of those pieces that ask
>> for some musicians to be placed in different locations round the back of
>> the concert hall for spatial effects?  Things can get fun with that ... :-)
>>
>>
>>> Only in the recording studio - if the time it takes for sound to leave your instrument, into the microphone, through the walls into the studio booth, into the mixer, (and assuming digital) from the mixer to the sound card, to the DAW software mixer which is taking the recording and mixing it in with the playing tracks (optional live effects processing being done), back to the sound card, to the mixer, through the walls into the studio room, and into the headphones of the person playing the instrument... is greater than 22ms, then the player experiences a delay between the time he plays and the time he hears himself in the song.  If that happens, you are not in a good situation. =)
>>>
>>
>> So, if your latency is 22ms, think of how that corresponds to sound travelling in space: you only need to be separated by about 7.5m for that kind of delay to kick in.
>>
>
> Delay between people isn't really the problem, it's delay in hearing yourself that's the killer. Although 22ms is the normally quoted limit for noticing the latency, it actually depends on frequency. Even regardless of frequency, I typically find that anything less than 64ms is OK, less than 128ms is just about bearable, and anything more is a serious problem for recording a tight-sounding performance.
>

Latency between recording musicians has a strange effect of gradually slowing the tempo down. I.e., if both musicians are playing with headphone monitors or something, and there's a small latency in the system: when you're playing together but you hear the other musician 20ms late, you tend to perceive yourself as playing slightly too fast, and you adjust by slowing down a fraction. The same thing happens in the other direction, so you're both constantly slowing by a fraction to maintain the perception of synchronisation, and the tempo gradually drags. It's almost an unconscious psychological response, and quite hard to control in the studio.
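If you want to see the drift on paper, here's a toy simulation of that feedback loop -- the latency and correction figures are invented, it's only meant to show the direction of the effect:

// Toy model: each player hears the other latencyMs late, reads that as
// "I'm rushing", and eases off by a fraction of the perceived error each bar.
import std.stdio;

void main()
{
    double bpm = 120.0;
    enum latencyMs = 20.0;   // monitoring latency between the two players
    enum reaction = 0.5;     // how much of the perceived error each player corrects

    foreach (bar; 0 .. 16)
    {
        double beatMs = 60_000.0 / bpm;
        double perceivedError = latencyMs / beatMs;  // fraction of a beat "too fast"
        bpm *= 1.0 - reaction * perceivedError * 2;  // both players back off
        writefln("bar %2d: %.1f bpm", bar + 1, bpm);
    }
}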

Man, my day job works in quantities of 16ms (1 frame), and I have spent many hours resolving inter-frame synchronisation issues (16ms out of sync). Maybe I'm just hyper-sensitive, but 64ms is extremely noticeable to me. 128ms is like an eternity!
Consider: 16th notes at 120bpm (not unusual in metal, I assure you) are only 125ms apart -- that's more than an entire note out.
Around 4ms is what professional recording setups aim for.
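The arithmetic, as a trivial helper for anyone who wants it (handy for hit-window logic later):

// Duration of one note of a given subdivision at a given tempo.
double noteDurationMs(double bpm, int subdivision)  // 4 = quarter, 16 = 16th
{
    double quarterMs = 60_000.0 / bpm;     // one beat
    return quarterMs * 4.0 / subdivision;  // scale to the subdivision
}

unittest
{
    // 16th notes at 120bpm: 60000/120 = 500ms per beat, /4 = 125ms apart,
    // so a 128ms latency really is more than a whole 16th note.
    assert(noteDurationMs(120, 16) == 125);
}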


December 13, 2013
On 13 December 2013 06:42, Rémy Mouëza <remy.moueza@gmail.com> wrote:

> If, when writing "mini and communication processing", you meant MIDI
> (Musical Instrument Digital Interface) instead of mini, you may be
> interested in my bindings to the RtMidi library:
>  - https://github.com/remy-j-a-moueza/drtmidi
>  - RtMidi website: http://www.music.mcgill.ca/~gary/rtmidi/index.html


I did. Very handy!


> On 12/12/2013 11:43 AM, Manu wrote:
>
>> So, I'm a massive fan of music games. I'll shamefully admit that I was tragically addicted to Dance Dance Revolution about 10 years ago. Recently, it's Guitar Hero and Rock Band.
>>
>> I quite like the band ensemble games, they're good party games, and great rhythm practice that's actually applicable to real instrument skills too.
>>
>> The problem, though, is that Neversoft and Harmonix completely fucked up the GH and RB franchises. Licensing problems, fragmented tracklists. It's annoying that all the songs you want to play are spread across literally 10 or so different games, and you need to constantly change discs if you want to play the songs you like.
>>
>> I've been meaning to kick off a Guitar Hero clone since GH2 came out. I started one years ago as a fork of my Guitar Hero song editor for PS2, and I added support for drums before GH4 or RB were conceived, but when they announced those games they stole my thunder and it went into hibernation.
>>
>> I'm very keen to resurrect the project (well, start a new one, with
>> clean code, in D).
>> Are there any music game nerds hanging around here who would be
>> interested in joining a side project like this? It's a lot more
>> motivating, and much more fun to work in a small team.
>>
>> It's an interesting union of skills: rendering, audio processing, super-low-latency synchronisation, mini and communications processing, animation, UI and presentation.
>>
>> I have done all this stuff commercially, so I can act as a sort of project lead if people are interested but haven't tried to write that sort of software before.
>>
>> It also seems like a good excuse to kick off a fairly large-scale and performance-intensive D project, which I like to do from time to time.
>>
>
>


December 13, 2013
Have you looked at Rocksmith? If I were going to spend a lot of time doing a guitar/music game, I would probably look in that direction. As far as I know, most audio professionals consider latencies under 20ms acceptable, but preferably smaller. From what I understand, the brain itself has a latency of about 12ms.


December 13, 2013
On 13 December 2013 14:07, Danni Coy <danni.coy@gmail.com> wrote:

> Have you looked at Rocksmith?


I have, and I definitely want to work that in... but game-ify it a bit more. That's substantially more work though, so I think GH is a simpler first target which will get UI and presentation out of the way, encourage more users+contributors, and should leave a good framework for any tightly synchronised gameplay.

It would be a very interesting piece of work, though, for someone awesome at signal analysis: taking an input audio stream and effectively breaking it into MIDI trigger information.
Given that, it's easy to build a game on top.
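To be concrete about what I mean by trigger information, the most naive possible sketch is plain energy-based onset detection over fixed frames -- real pitch/onset tracking (what Rocksmith actually needs) is far more sophisticated, this is only the shape of the problem:

// Flag frames whose energy jumps well above a running average; each flagged
// frame is a candidate "note on" trigger. Purely illustrative.
size_t[] detectOnsets(const(float)[] samples, size_t frameSize = 512,
                      double threshold = 2.0)
{
    size_t[] onsets;
    double avgEnergy = 0;
    size_t frame = 0;

    for (size_t start = 0; start + frameSize <= samples.length;
         start += frameSize, ++frame)
    {
        // Mean squared energy of this frame.
        double energy = 0;
        foreach (s; samples[start .. start + frameSize])
            energy += s * s;
        energy /= frameSize;

        // A sudden rise well above the recent average counts as an onset.
        if (frame > 0 && energy > threshold * avgEnergy && energy > 1e-6)
            onsets ~= frame;

        // Exponential moving average of recent energy.
        avgEnergy = 0.9 * avgEnergy + 0.1 * energy;
    }
    return onsets;
}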

> If I were going to spend a lot of time doing a guitar/music game, I would probably look in that direction. As far as I know, most audio professionals consider latencies under 20ms acceptable, but preferably smaller. From what I understand, the brain itself has a latency of about 12ms.


December 13, 2013
On Thu, Dec 12, 2013 at 10:47:36PM +0100, Joseph Rushton Wakeling wrote:
> On 12/12/13 22:13, H. S. Teoh wrote:
> >Ahh, so *that's* why they do that!! I've always been wondering why the orchestra always seems to be out-of-beat with the conductor, and why the conductor's beats don't seem to line up with the actual sound.
> 
> There's quite a nice blog post describing some of the reasons behind this here, if you're interested: http://blog.davidhthomas.net/2006/12/but-im-with-the-conductor/

Interesting! Makes total sense, though. You're dealing with 100+ human players, and keeping them all in sync is quite challenging. It's totally different if you're playing a computer-conducted orchestra, where everything is mechanically kept in top-notch sync. :P  For some reason, that doesn't sound as good as a live orchestra.

(I read somewhere that it is due to our brains automatically filtering out repetitive stimuli.  The precise timing of computer-generated music produces an exact, mechanical rhythm, which causes the brain to tune out, resulting in a perception of dullness or tiredness. Human players, OTOH, are always ever so slightly off the beat, and the slight variations keep the brain interested and stop it from tuning out. The same thing applies to the precise attack velocities of computer-generated notes -- after a while it feels tiring because it's exactly the same velocity over and over. Human players produce quite a wide variety of attack velocities, even when playing the same notes over and over, which makes it far more interesting to listen to. Inexactness isn't always a bad thing!)
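Incidentally, that's why sequencers usually offer some kind of "humanise" option: add a little random jitter to note timing and velocity and the mechanical feel goes away. A minimal sketch in D (the note struct and jitter ranges here are made up for illustration, not any particular sequencer's API):

import std.random : uniform;

// Made-up note representation, just for illustration.
struct MidiNote
{
    double timeMs;   // note start time in milliseconds
    int velocity;    // MIDI velocity, 1 .. 127
}

// Nudge each note's timing and velocity by a small random amount.
void humanise(MidiNote[] notes, double timingJitterMs = 10, int velocityJitter = 8)
{
    foreach (ref n; notes)
    {
        n.timeMs += uniform(-timingJitterMs, timingJitterMs);
        int v = n.velocity + uniform(-velocityJitter, velocityJitter + 1);
        n.velocity = v < 1 ? 1 : (v > 127 ? 127 : v);
    }
}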


T

-- 
Study gravitation, it's a field with a lot of potential.