So why do we need a keyboard? Is the feeling of something moving beneath your fingers something that people will always need or is it just what we know, so it’s harder to move away from it?
Will designers in the future begin designing textures and variable temperatures of the screens to provide more feedback?
This visually lo-fi iPhone game applies some of the affordances of a touch screen to a traditional platformer, warrior's-quest scenario.
The first thing I notice is the cinematic panning used to explore the player's field of view. I'm assuming the character is directed purely with the player's finger.
At the six-minute mark you'll see the player actually moving into the world, with the graphics appearing and scrolling to provide the illusion.
I interviewed Dr. Randolph Foy from the music department at State. Below is a crystallized flow of the notes I took.
Reading music and playing a stringed instrument is one of the most complex processes that humans perform.
There are two aspects involved in reading music.
1) deciphering notes
2) transferring this to physical actions
Suzuki, for example, teaches students to play BY EAR, eliminating the entire complex aspect of reading music. Playing by ear tackles only one aspect of playing music: the physical, actionable elements.
Counting (1 & 2 &) is a function of the analytical side of the brain. It's an analytical problem.
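To make the counting mechanic concrete, here is a minimal sketch of how verbal counts could be generated for a measure. It assumes simple time and an eighth-note subdivision; the function name and structure are my own illustration, not anything from the interview.

```python
def counts_for_measure(beats_per_measure: int) -> list[str]:
    """Return the spoken count syllables for one measure,
    assuming an eighth-note subdivision ("1 & 2 & ...")."""
    syllables = []
    for beat in range(1, beats_per_measure + 1):
        syllables.append(str(beat))  # the downbeat: "1", "2", ...
        syllables.append("&")        # the off-beat eighth: "and"
    return syllables

print(counts_for_measure(4))  # ['1', '&', '2', '&', '3', '&', '4', '&']
```

A visual counting aid could map each syllable to an image or movement instead of a word, which is the translation problem explored later in these notes.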
When playing the clarinet you aren't computing when and how to bow; you're computing "when can I take a breath?" Each instrument poses different physical problems to solve.
Gymnasts and figure skaters also solve physical problems by calculating spins and movements.
Computation problem + Expressive problems
(ex. a jazz musician who has to determine what they can do in a certain amount of time to resolve their improvisation. The best improvisation is computationally and emotionally rigorous.)
The above ideas are all separate from the tasks of decoding notes and symbols.
Lots of cultures do not have written music yet still make complex arrangements and beats. Learning music has, for much of history, been an oral tradition. You learned to listen hard, and that was how you learned to play.
Playing from notated music is an entirely different "role". When you're reading music you're fulfilling someone else's intent and not your own.
All western notation is symbolic and not graphic. (Key)
There are two parameters to music.
TIME: (all music exists in time necessarily. it unfolds)
SOUND: sound is also mediated through time; the two are interconnected. Sound is connected to pitch, which is easily discernible to the human ear. If a pitch is off by even a tiny fraction of a vibration, the ear can detect it and say "that isn't right".
Within this symbolic system, we indicate pitch and temporal characteristics through symbols. However, temporal features of the symbols are dependent on so many factors. The tempo of the conductor, or how fast others are playing, the descriptions written by the composer, and most importantly, the way the symbol looks.
Players reference the time and key signatures to decode the notes. It allows you to decode the symbols with everyone else so that you’re all playing the same thing.
[parallel case example: viola players, when switching clefs, will write the note names (A, B, C) above the symbols. This helps them decode because they may know that a "D" means "third finger" in the physical realm.]
How Players Fail
You can fail by misreading the notes, or you can fail technically by simply not knowing where to play: failing in dexterity rather than comprehension. However, there's no easy way to know where a player is failing. How do you verify that players understand the notation and aren't just failing at the technical aspects?
Maybe I can focus on learning pitches without rhythms, or vice versa. Is there a way to teach proper decoding of rhythm notation and physical time lengths?
example: Clap exercises verify that players understand rhythm notation but don't tackle the problem of finding correct pitches.
• Changes the playing of a tune according to the time signature.
• Revises rhythms based on a new time signature.
• Habitually counts out the beats while sight-reading.
• Proceeds through a piece of music by sight-reading.
• Maintains motivation to continue playing over a long period of time.
• Empower individuals in learning to play an instrument
• Motivate users to continue practicing this difficult task
• Build users' understanding of music theory through verbal, visual, and auditory means
• Allow users to accurately sight-read pieces of music
• Give users feedback about how to improve their technique
Outcomes: By using this application you should…
[Adapts skill sets to meet a problem situation]
• Change the duration of the bow's stroke in accordance with the note characteristics.
• Revise mental rhythms according to the time signature.
• Habitually count out the beats while sight-reading.
• Proceed through a piece of music by sight-reading.
• Maintain motivation to practice.
[Sense cues that guide motor activity]
• Detect when you are playing from “feel” instead of conscious internal rhythm.
• Detect when it is appropriate to play from a “feel”
[Attach value or worth]
• express satisfaction over seeing your progression
• be aware of negative habits and tendencies while playing
How can I design an application for users who will most likely have their minds and bodies preoccupied with playing an instrument? How do you manage the handling of a mobile device when learners need to move and have their hands free while doing an activity?
I’d like to create an application that helps people count rhythms. Sometimes while playing my instrument it’s hard to break into the verbal portion of my brain, which is where counting takes place. My concern is that I don’t want to make an alternative method of counting rhythms that people become dependent on, but a visual system that acts as training wheels for verbal counting. How do you design an application that makes itself irrelevant the more people use it?
In your presentation you spoke about “disruptive innovation” and looking ahead 3-5 years. When developing ideas, how exactly do you look ahead into the future? Do your ideas reference current technologies? Why not look ahead 15-20 years? When does forward thinking become fantasy?
People who are having trouble counting in their heads in accordance with specific note values in the music. It’s a clash of the left and right side of the brain and not all people can think about words while in the music “zone”.
Also, I’d like this app to motivate people to progress and not quit. I think this could be accomplished by introducing a competitive aspect.
The first goal would target both older and younger individuals, while the competitive aspect would mainly target younger students.
This is a visual system that functions like the verbal counting mechanism many individuals learn (“one and two and three and four and one and…” etc.). I need to think about how to translate words into images or movements that keep track of note values as effectively as words do.
Also, I’d like to have an evolving system that develops musically as you do and you compete against this “character” to out-perform it by playing the fastest, most in tune, clearest, loudest, most accurately, slowest etc..
The concept could be embedded in the music stand, or the music itself. Also the instrument could be augmented in some way.
When: The use would come while practicing at home away from a lesson, or at an orchestra rehearsal when you have to be able to sight-read with the other players. A big challenge is remembering what you learned at your lesson and then applying those principles to your practice at home.
Learning an instrument is difficult. Most of the challenge is staying motivated to practice even when the going is slow. Many times students quit after a few months or even a couple of years, on the cusp of understanding. People quit just before they start to sound good on their instrument.
As said before, I want this to be visual and not verbal because when I’m playing it’s very hard for me to talk or read things.
Either this is a visual map that helps you get through the music or a device that records your progress and puts you head-to-head with another player.
I think this device will combine the motivation aspect with the counting aspect by tracking how well users play the rhythms, and then overlaying a visual of your playing onto the ideal rhythm of the measure of music.
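The rhythm-tracking idea above could work by comparing the timestamps of a player's note onsets against the ideal beat grid implied by the tempo. Here is a hypothetical sketch of that comparison; the function name, signature, and test values are my own assumptions, not part of the original design.

```python
def rhythm_errors(onsets, tempo_bpm, beats):
    """Return the timing error (seconds) of each played onset
    against the nearest ideal beat of a simple metronome grid."""
    beat_len = 60.0 / tempo_bpm
    grid = [i * beat_len for i in range(beats)]
    errors = []
    for t in onsets:
        nearest = min(grid, key=lambda g: abs(g - t))
        errors.append(t - nearest)  # negative = early, positive = late
    return errors

# A player at 120 BPM (beats at 0.0, 0.5, 1.0, 1.5 s),
# slightly late on beat 1 and early on beat 2:
errs = rhythm_errors([0.02, 0.45, 1.01, 1.50], tempo_bpm=120, beats=4)
```

Per-note errors like these could drive the overlay visualization, coloring each note by how far it drifted from the ideal grid.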
Cleartune is a chromatic instrument tuner and pitch pipe that allows you to quickly and accurately tune up using the built-in mic in your iPhone.
Cleartune features a unique “note wheel” interface allowing you to quickly find your pitch, paired with a highly responsive fine-tuning meter for the perfect tune.
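As a rough illustration of the kind of signal processing behind tuners like this, here is a hedged sketch of pitch estimation from a mic buffer via autocorrelation. This is a generic textbook technique, not Cleartune's actual implementation, and all names here are my own.

```python
import math

def estimate_pitch(samples, sample_rate, fmin=60.0, fmax=1000.0):
    """Estimate the fundamental frequency (Hz) of a mono sample buffer
    by finding the lag with the strongest autocorrelation."""
    lag_min = int(sample_rate / fmax)  # shortest period considered
    lag_max = int(sample_rate / fmin)  # longest period considered
    best_lag, best_corr = lag_min, float("-inf")
    for lag in range(lag_min, min(lag_max, len(samples) - 1)):
        corr = sum(samples[i] * samples[i + lag]
                   for i in range(len(samples) - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

# Synthesize a quarter second of A440 and recover its pitch:
sr = 8000
tone = [math.sin(2 * math.pi * 440 * n / sr) for n in range(sr // 4)]
pitch = estimate_pitch(tone, sr)  # roughly 440 Hz
```

A real tuner refines this with interpolation between lags and noise handling, then maps the frequency onto the nearest named note, which is what the "note wheel" presents.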
While thinking about how and why people learn to play instruments I came across this project: an interactive music table. While it’s not exactly mobile, the interactive table affords musical exploration while eliminating some of the technical skills required to produce notes on an instrument. Many digital music programs also allow musical exploration sans technical skills, but this puts tactility back in the mix. I like to imagine how devices like this could be used to create a wave of next-gen electronic music where the musician’s physical movements and fine motor skills are re-introduced into the note-producing process.
I’m a bit skeptical of constant entertainment and I appreciate how this course isn’t focusing on making a killer app, but trying to find out how to use mobile devices to advance fluid ways of learning. Here is an example of new mobile experience that already seems outdated: Mobile TV.
“Just about 1 percent of mobile users in the U.S. watch mobile TV.”
learning or speaking another language
conducting design research
learning to read music
making electronic music
setting up and designing a website
growing plants or pruning grapevines
using the shop
crafting characters and a good story