Now, when I wanted to add touchscreen support to the synthesizer application, I initially started with touch events. There were two challenges, though. First, you’ll notice that on the desktop, if I don’t have touch event emulation turned on, it doesn’t work at all when I use the mouse. [NOISE] Of course, that’s expected, but I didn’t really want two code paths, one for mouse and one for touch. Second, I really wanted to be able to drag across the keyboard and play each note in turn as I slid over it. But because touch delivers all of its events to the element where the touch originally went down, I would have had to do the hit testing myself, and that was kind of a pain. I really wanted those events delivered to the element under my finger as it crossed each key. Instead, what happens in this synth is that when I press one key [SOUND] and then drag, the target of the events never changes.
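To make the manual-hit-testing pain concrete, here is a minimal sketch of what you end up writing with raw touch events. All the names here (`keyRects`, `keyAtPoint`, `noteOn`, `lastKey`) are illustrative, not from the actual synth code: because `touchmove` keeps firing at the element where the finger first went down, you have to map each touch point back to a key yourself.

```javascript
// Pure helper: find which key rectangle contains the point (x, y),
// or return null if the point is over no key. Each rect carries the
// note it should trigger.
function keyAtPoint(keyRects, x, y) {
  return keyRects.find(r =>
    x >= r.left && x < r.left + r.width &&
    y >= r.top && y < r.top + r.height
  ) || null;
}

// In the browser you would wire it up roughly like this (sketch only):
//
// let lastKey = null;
// keyboard.addEventListener('touchmove', e => {
//   for (const t of e.changedTouches) {
//     // document.elementFromPoint(t.clientX, t.clientY) is the standard
//     // way to re-hit-test; cached rects as above work too.
//     const key = keyAtPoint(keyRects, t.clientX, t.clientY);
//     if (key && key !== lastKey) {
//       noteOn(key.note);   // hypothetical synth trigger
//       lastKey = key;
//     }
//   }
//   e.preventDefault();     // keep the browser from scrolling instead
// });
```

Pointer events avoid all of this boilerplate, since you can release the implicit capture and let each key receive its own events as the pointer crosses it.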