Case Study: Moodles
Our Story
In 2016, Classical KUSC began searching for new opportunities to inspire young people to engage with classical music. After a few brainstorming sessions, two ideas stood out: the first was to produce a daytime event at the Los Angeles Natural History Museum called Discovery Day; the second was to launch an iOS app called Moodles. Moodles is a first-of-its-kind musical instrument app that lets children and teenagers compose music without the banality of classes. The goal is to introduce music composition in a fun way that does not require music theory lessons. The last thing the team wanted to build was an iPad piano app that your mom tells you to practice on!
The story below describes how the team approached developing a new product that includes animation, sound, music quizzes, and real-time song collaboration.
Research
Market Sizing
Our initial research on "kids music apps" revealed that there weren't any music apps aimed at kids aged 13 to 18. We therefore decided to make something fun, collaborative, and easy to integrate with Facebook.
With these basic features in mind, the idea evolved into an app with two distinct goals: enabling kids both to compose songs and to learn classical music through quizzes. The quiz followed a 'Simon Says' premise that taught children to follow along using the touch screen.
Partnership
We worked in partnership with USC's classical music radio station KUSC. The station believed that products catering to children held untapped potential for expanding into the youth market. KUSC's audience belongs to an older generation, and Moodles was an accessible way to bridge the gap between them and their children or grandchildren. Our goal was to assess children's interest in creating and sharing music in the context of a radio station.
Licensing
The biggest challenge in developing music apps is overcoming licensing rights. At the time, the team had no access to music publishing or streaming rights; therefore, an instrument app that enables users to compose their own music was the best path forward.
Brand Development
Product Concept
Moodles is an audiovisual instrument that lets the musician see an animation the moment it begins playing its sound.
Key Features:
- It's an instrument that enables kids to become song composers and share their work.
- Users share compositions via email or Facebook posts.
- Kids create musical doodles and share them with their friends.
- Available for both iPad and iPhone.
- A quiz that helps kids learn Beethoven's 5th through the game of Simon.
Mission Statement
Moodles is an audiovisual instrument that empowers kids to make, record, and share music in a fun, interactive way.
Although this mission statement is simple by design, it took 25 tries to get right. At one point, the team even built a Mad Libs-style mission statement builder.
Project X is a [instrument app]
that [helps, enlightens, teaches, encourages, promotes, empowers]
[student musicians, baby Mozarts, mini musicians, kids, students, visual musicians, second screen toddlers, whatever feels right]
through a [screen, iPhone, iPad, mobile app, TV app, visualizer, software app].
Essentially, it is a concise way of presenting (a) who you are (or in this case, what the app is), (b) what it does, (c) who it is for, (d) which platform it runs on, and (e) what it is supported by.
Vision Statement
Our vision statement captures the future we set out to create. Learning how to play an instrument is hard and often discouraging, especially when kids don't have the right equipment, training, and encouragement.
Moodles helps to make music fun by creating a new kind of product category, an audiovisual instrument. Moodles solves the challenges of music theory, sight-reading, musical pedagogy, MIDI, and sound recording by creating an app that is fun to play, easy to record, and simple to share with friends.
Unlike professional music apps such as Korg Module, Korg Gadget, Cubasis, and GarageBand, Moodles doesn't require kids to take classes to compose an excellent-sounding song that helps them express themselves.
Value Proposition
The value proposition is what separates us from the rest; it makes us unique. Unlike professional music apps like Sibelius, Logic, Pro Tools, and GarageBand, Moodles requires no classes (or video tutorials) to compose a musical, expressive song. Our most ambitious effort lay in creating original sounds to match our custom animations, which helped overcome the obstacle that licensing rights presented. Instead of employing royalty-free music, we hired sound engineer Tim Yang to build a sound library befitting our custom animations.
We're creating a new generation of musicians who push the boundaries of music engagement through touch screen interfaces, simple recording techniques, and social sharing. This new paradigm in peer-to-peer songwriting and co-composition will shape how people worldwide make music over wireless broadband.
Naming a Product
The team completed a three-round naming process. In the end, we chose Moodles, a hybrid of "Musical Doodles."
Design
The design process consists of three main parts: a mood board, wireframes, and final UIX comps. Art direction and design were led by Grant Kindrick, who can make even a mood board look like a work of art.
Mood board
Typography
Color Palette
Abstract Shapes
Logo
UIX
Below are the actual design compositions, produced in Sketch. The app itself was designed for both iPhone and iPad devices.
This example shows what the iOS comps looked like in Sketch.
Development
Developing Custom Animations
From the outset, the team wanted to provide custom animations for the music application. We worked with Swedish designer Karolin Gu to create a suite of new animations using Adobe After Effects.
From there, we worked with an After Effects plug-in called Squall to export the animations into Apple's Swift.
Moodles Animations
View animations on YouTube.
This video packages all the animation together into a single user experience.
Developing Custom Sounds
We hired sound engineer Tim Yang to create a custom sound library to match Karolin's animations. Tim is brilliant and produced a sound palette in just a few days, giving the engineers time to learn how to package the sounds into a SoundFont. SoundFont is a brand name covering both a file format and a technology that uses sample-based synthesis to play MIDI files.
The Polyphone SoundFont editor enabled us to package Tim's WAV files into a single .sf2 library. If you would like to hear the sounds, you can download them, royalty-free, from the Internet Archive.
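For context on how such a library gets played back, here is a minimal sketch of loading an .sf2 file on iOS with Apple's AVAudioUnitSampler, the sampler Audio Unit that frameworks like AudioKit wrap; the "MoodlesSounds" resource name is hypothetical.
import AVFoundation
import AudioToolbox

// Minimal sketch: load a SoundFont (.sf2) into Apple's sampler Audio Unit
// and play a single note. "MoodlesSounds" is a hypothetical resource name.
let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)

do {
    if let url = Bundle.main.url(forResource: "MoodlesSounds", withExtension: "sf2") {
        try sampler.loadSoundBankInstrument(at: url,
                                            program: 0,
                                            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                            bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        try engine.start()
        sampler.startNote(60, withVelocity: 127, onChannel: 0) // middle C
    }
} catch {
    print("Failed to load SoundFont: \(error)")
}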
Working with MIDI
AudioKit Pro makes it easy to build iOS applications that use MIDI and Apple's CoreAudio. Therefore, our first step toward building an instrument app was to map musical notes to our MIDI piano. The diagram below shows how the team mapped each note to MIDI JSON.
Here is a sample of the Swift code. These protocol definitions make it possible to bridge the musical notes (i.e., pitch and rhythm) a user creates when tapping the screen to MIDI events that we can later record, play back, and store in the cloud as a composition.
import Foundation

// Structures related to the Music JSON format
enum MIDIEventType: String {
    case note = "note"
    case param = "param"
    case control = "control"
    case pitch = "pitch"
    case chord = "chord"
    case sequence = "sequence"
}

protocol MIDIComposition {
    associatedtype TNoteEvent: MIDINoteEvent
    var noteEvents: [TNoteEvent] { get }
    var length: Double { get }
    var tempo: Double { get }
}

protocol MIDIEvent {
    var time: Double { get }  // time in beats
    var type: String? { get } // one of MIDIEventType
}

protocol MIDINoteEvent: MIDIEvent {
    var duration: Double { get } // duration in beats
    var number: Int16 { get }    // 0-127, the pitch of the note
    var velocity: Int16 { get }  // 0-127
}
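The concrete types that appear later in this post, MDNoteEvent and MDComposition, are not shown in full. A minimal conformance consistent with how the parsing code below uses them might look like this (a sketch, not the shipped code):
// Sketch: concrete conformances matching the initializers used later on.
struct MDNoteEvent: MIDINoteEvent {
    let time: Double
    let type: String? = MIDIEventType.note.rawValue
    let number: Int16
    let velocity: Int16
    let duration: Double

    init(time: Double, number: Int16, duration: Double, velocity: Int16) {
        self.time = time
        self.number = number
        self.duration = duration
        self.velocity = velocity
    }
}

struct MDComposition: MIDIComposition {
    let noteEvents: [MDNoteEvent] // TNoteEvent is inferred as MDNoteEvent
    let length: Double
    let tempo: Double
    let name: String
    let id: String
}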
Bridging Music + JSON + Swift
We were inspired by the work of Andrew Madsen, who showed us a way to bridge Music + JSON + Objective-C.
Once we were able to bridge music and JSON and provide a systematic way to record and playback any song, we then began developing a musical quiz.
MIKMIDINoteEvent
MIKMIDINoteEvent(timeStamp: 0, note: 48, velocity: 127, duration: 0.5, channel: 0)
JSON
[3, "note", 50, 127, 0.5]
Bridging these two concepts enabled us to record when a user taps a UICollectionViewCell and convert the tap to JSON for data storage.
{
    "name": "My First Song",
    "events": [
        // User Tap #1
        [2, "note", 48, 127, 0.5],
        // User Tap #2
        [2.5, "note", 49, 127, 0.5],
        // format: [timeStamp, type, note, velocity, duration]
        [3, "note", 50, 127, 0.5],
        [3.5, "note", 51, 127, 3.5],
        [10, "note", 52, 127, 0.5]
    ]
}
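To make the bridge concrete, here is a small sketch (the helper names are ours, not the app's) that serializes recorded note events into the array format above using Foundation's JSONSerialization:
import Foundation

// Sketch: encode note events as [time, "note", number, velocity, duration]
// arrays and wrap them in the song payload shown above.
func jsonArray(for event: MDNoteEvent) -> [Any] {
    return [event.time, "note", Int(event.number), Int(event.velocity), event.duration]
}

func songJSON(name: String, events: [MDNoteEvent]) throws -> Data {
    let payload: [String: Any] = [
        "name": name,
        "events": events.map(jsonArray(for:))
    ]
    return try JSONSerialization.data(withJSONObject: payload, options: [.prettyPrinted])
}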
Real Example
Here is an example of a user-created composition:
{
"length": 33.733943939208984,
"name": "Mary",
"tempo": 60,
"events": [
[1.735548973083496, "note", 86, 127, 2.0011720657348633],
[2.200732946395874, "note", 82, 127, 2.0007500648498535],
[2.666996955871582, "note", 80, 127, 1.5799760818481445],
[3.134163022041321, "note", 78, 127, 2.001255989074707],
[4.6933770179748535, "note", 84, 127, 0.4515969753265381],
[3.976855993270874, "note", 74, 127, 2.0013020038604736],
[4.24698793888092, "note", 80, 127, 2.0012450218200684],
[4.441772937774658, "note", 79, 127, 2.0023610591888428],
[4.9407700300216675, "note", 83, 127, 2.0013009309768677],
[5.144986987113953, "note", 84, 127, 2.0023430585861206],
[5.47352397441864, "note", 86, 127, 2.001214027404785],
[7.661324977874756, "note", 84, 127, 1.3822569847106934],
[7.083531022071838, "note", 82, 127, 2.0057259798049927],
[7.427384972572327, "note", 78, 127, 2.0109879970550537],
[7.8482489585876465, "note", 74, 127, 2.0053189992904663],
[8.159757018089294, "note", 80, 127, 2.008544921875],
[8.585530042648315, "note", 83, 127, 2.001150965690613],
[9.043583035469055, "note", 84, 127, 2.006042003631592],
[9.545789003372192, "note", 87, 127, 2.0242420434951782],
[10.068104982376099, "note", 86, 127, 2.001162052154541],
[11.023968935012817, "note", 90, 127, 2.0013060569763184],
[12.052382946014404, "note", 91, 127, 2.03545606136322],
[13.504536986351013, "note", 85, 127, 2.000174045562744],
[13.735672950744629, "note", 88, 127, 2.0012930631637573],
[13.969988942146301, "note", 89, 127, 2.001507043838501],
[14.230641961097717, "note", 84, 127, 2.000688076019287],
[14.501525044441223, "note", 82, 127, 2.0002689361572266],
[16.187057971954346, "note", 85, 127, 0.9669140577316284],
[17.41103994846344, "note", 88, 127, 1.2865140438079834],
[17.153972029685974, "note", 85, 127, 2.0012149810791016],
[17.669700980186462, "note", 89, 127, 2.032960057258606],
[17.962610006332397, "note", 84, 127, 2.023123025894165],
[18.2659410238266, "note", 86, 127, 2.0036189556121826],
[18.69755494594574, "note", 88, 127, 2.0013450384140015],
[18.985236048698425, "note", 82, 127, 2.0052649974823],
[19.462666034698486, "note", 80, 127, 2.029099941253662],
[19.757699966430664, "note", 81, 127, 2.0025429725646973],
[20.59239101409912, "note", 90, 127, 2.000859022140503],
[21.264801025390625, "note", 89, 127, 2.0012810230255127],
[22.036635994911194, "note", 93, 127, 2.00110399723053],
[23.480831027030945, "note", 95, 127, 1.3238129615783691],
[24.80464494228363, "note", 95, 127, 2.005560040473938],
[26.1202529668808, "note", 94, 127, 1.3467869758605957],
[27.467041015625, "note", 94, 127, 2.0007799863815308],
[28.701019048690796, "note", 78, 127, 2.0011409521102905],
[29.587227940559387, "note", 72, 127, 2.001275062561035],
[30.413745045661926, "note", 73, 127, 2.0694299936294556],
[31.169628024101257, "note", 74, 127, 2.016849994659424],
[31.938958048820496, "note", 72, 127, 1.7940999269485474],
[32.785423040390015, "note", 91, 127, 0.9476529359817505],
[32.18381094932556, "note", 77, 127, 1.549280047416687],
[32.37651801109314, "note", 81, 127, 1.3565880060195923],
[32.48401200771332, "note", 82, 127, 1.2491090297698975],
[32.115553975105286, "note", 76, 127, 1.6175800561904907]
]
}
Making of a Musical Quiz
To develop a game of Simon, the team created a process for generating MIDI files and converting them into MIDI JSON. Below is an example of what our MIDI files looked like. The compositions were edited in Pro Tools and then exported to MIDI for further processing into MIDI JSON. MIDI JSON is the modern equivalent of MIDI XML and a better format for storing data in the cloud.
The quiz consists of the melody for Beethoven's 5th symphony chopped up into 14 sections, which we later structured into rounds.
Round #1 is very recognizable. The screenshot shows the start of Beethoven's famous "da da da dum".
This example shows what it’s like for a user to tap three notes at the same time.
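Conceptually, each round plays one section of the melody and then compares the user's taps against it, just like Simon. A minimal sketch of that check (the QuizRound type and the matching rule are our assumptions, not the shipped logic):
// Sketch: one quiz round holds a section of Beethoven's 5th; the user
// passes by tapping the same pitches in the same order. Timing is
// ignored here to stay forgiving for small fingers.
struct QuizRound {
    let expected: [MDNoteEvent]
}

func userPassed(_ round: QuizRound, taps: [MDNoteEvent]) -> Bool {
    guard taps.count == round.expected.count else { return false }
    return zip(round.expected, taps).allSatisfy { expected, tap in
        expected.number == tap.number
    }
}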
Once the MIDI files were imported into the iOS app, the data was parsed for MIDI playback. The function below shows a sample of what we had to do to parse a MIDI file into JSON dynamically.
import Foundation
import AudioKit
import MagicalRecord
import PromiseKit

func parse(fileUrl: URL) -> Promise<MDComposition> {
    return Promise(resolvers: { fulfill, reject in
        let name = fileUrl.lastPathComponent.replacingOccurrences(of: "." + fileUrl.pathExtension, with: "")
        var length: Double = 0
        var tempo: Double = 0
        var events = [MDNoteEvent]()

        AKSequencer(urlPath: fileUrl as NSURL).tracks.forEach { track in
            // The composition is as long as its longest track.
            let trackLength = Double(track.length)
            if trackLength > length {
                length = trackLength
            }

            // Walk the track's events with a CoreAudio MusicEventIterator.
            var iterator: MusicEventIterator? = nil
            NewMusicEventIterator(track.internalMusicTrack!, &iterator)

            var eventTime = MusicTimeStamp(0)
            var eventType = MusicEventType()
            var eventData: UnsafeRawPointer? = nil
            var eventDataSize: UInt32 = 0
            var hasNextEvent: DarwinBoolean = false
            MusicEventIteratorHasCurrentEvent(iterator!, &hasNextEvent)

            while hasNextEvent.boolValue {
                MusicEventIteratorGetEventInfo(iterator!, &eventTime, &eventType, &eventData, &eventDataSize)

                // MIDI note messages become MDNoteEvents.
                if kMusicEventType_MIDINoteMessage == eventType {
                    let noteMessage: MIDINoteMessage = (eventData?.bindMemory(to: MIDINoteMessage.self,
                                                                              capacity: 1).pointee)!
                    let noteEvent = MDNoteEvent(time: eventTime,
                                                number: Int16(noteMessage.note),
                                                duration: Double(noteMessage.duration),
                                                velocity: Int16(noteMessage.velocity))
                    events.append(noteEvent)
                }

                // The tempo is read from a text meta event (0x01).
                if kMusicEventType_Meta == eventType {
                    var meta: MIDIMetaEvent = (eventData?.load(as: MIDIMetaEvent.self))!
                    if meta.metaEventType == 0x01 {
                        var dataArr = [UInt8]()
                        withUnsafeMutablePointer(to: &meta.data, { ptr in
                            for i in 0 ..< Int(meta.dataLength) {
                                dataArr.append(ptr[i])
                            }
                        })
                        let oldString = NSString(bytes: dataArr, length: dataArr.count,
                                                 encoding: String.Encoding.utf8.rawValue)
                        if let string = oldString as? String,
                            let beats = Double(string) {
                            tempo = beats
                        }
                    }
                }

                MusicEventIteratorNextEvent(iterator!)
                MusicEventIteratorHasCurrentEvent(iterator!, &hasNextEvent)
            }
        }

        let composition = MDComposition(noteEvents: events,
                                        length: length,
                                        tempo: tempo,
                                        name: name,
                                        id: "")
        fulfill(composition)
    })
}
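Consuming the parser is then straightforward with PromiseKit (a sketch assuming the PromiseKit 4-era API that matches the Promise(resolvers:) initializer above; "round1.mid" is a hypothetical bundled file name):
// Sketch: parse a bundled MIDI file and log the resulting composition.
if let url = Bundle.main.url(forResource: "round1", withExtension: "mid") {
    parse(fileUrl: url).then { composition -> Void in
        print("Parsed \(composition.name): \(composition.noteEvents.count) notes")
    }.catch { error in
        print("MIDI parsing failed: \(error)")
    }
}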
Storing User Compositions into the Cloud
We used Firebase as a real-time data store for recording user compositions. Converting a user's taps into MIDI JSON enabled us to publish each composition to Firebase for both storage and sharing. This technique also kept latency below 300ms and kept file sizes small.
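As a rough sketch of that publishing step (the "compositions" path and payload shape are our assumptions; the calls are the Firebase Realtime Database SDK of that era):
import FirebaseDatabase

// Sketch: push a MIDI JSON composition dictionary to Firebase under a
// hypothetical "compositions" node; childByAutoId() creates a unique key.
func publish(composition payload: [String: Any]) {
    let ref = Database.database().reference()
        .child("compositions")
        .childByAutoId()
    ref.setValue(payload) { error, _ in
        if let error = error {
            print("Upload failed: \(error)")
        }
    }
}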
Marketing
Pilot Project
The app made its debut at KUSC Kids Discovery Day at the Natural History Museum of Los Angeles County on April 9, 2017. It passed the test of garnering interest: children were most excited by the 'Simon Says' quiz, which let them play compositions like Beethoven's Symphony No. 5.
Landing Page
Final Thoughts
I love making instrument apps. I love developing for iOS, and I'm thankful that tools like AudioKit and Squall exist. I'm especially grateful to both KUSC and my team, who helped make this project possible. I can't wait to build the next app.
This entire project was worth it to see the smiles on kids' faces.