MAKE YOUR TUNE WITH EVERYDAY OBJECTS
This three-week project is about expressing yourself through music. Gizmu is a music app that utilizes augmented reality to help users become composers and create interesting music with common objects accessible to anyone.
How might we enable anyone to become a composer and create interesting music using common objects?
Athena Hosek, Eva Chen, James Lee, Kristine Yang, Marina Guthmann, Yen Huang
sketches, user flows, wireframing, prototyping, documentation
Figma, Adobe Illustrator, Photoshop, After Effects
This project is about expressing yourself through the creation of music, with the help of everyday objects!
Anyone can be a composer and create interesting music with common, easily accessible objects.
Empower people to become composers
Here comes Gizmu
- an AR app that helps you make your tune with everyday objects -
- Let's have fun! -
(A kind reminder: this video contains sound, so please turn down your volume if needed)
Initially, we wanted to build the app for AR devices such as the Magic Leap glasses, because the hand-tracking on the glasses frees users' hands. However, after weighing the team's familiarity with each platform, users' access to devices, and our capability to build on each, we decided to make a mobile app so a broader public could play with it.
🎯 Magic Leap vs. Mobile
Choosing which device to develop for is never easy. Considering the factors in the comparison above, developing the app on mobile is more accessible for users and more controllable for our team. In addition, we can build for both Android and iOS users.
🔍 Main Features
How can we inspire users to create music? In the beginning, we brainstormed several ways to engage users in this wonderful music experience and inspire creativity in music composition. However, given the short timeline, we decided to build a minimum viable product (MVP) with the major features in these three weeks and test with users later to decide the product's future path.
🎯 MVP Features
In these three weeks, we planned to develop an MVP focused on the music creation process, with three major features:
- Object detection: uses real-life everyday objects as sources for creation
- Music assignment: assigns and visualizes sounds on different common objects (a data-structure sketch follows below)
- Sound library: a collaborative database of creative sounds for users
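The write-up doesn't detail how assignments are stored; a minimal sketch, assuming each detected object is identified by its class label and mapped to the user's chosen clip (all names here are hypothetical):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the music-assignment mapping: each detected object
// label keeps the sound the user assigned to it, so re-detecting the same
// object replays the same loop.
public class SoundAssignments : MonoBehaviour
{
    readonly Dictionary<string, AudioClip> assignments = new Dictionary<string, AudioClip>();

    public void Assign(string objectLabel, AudioClip clip) => assignments[objectLabel] = clip;

    public bool TryGetClip(string objectLabel, out AudioClip clip) =>
        assignments.TryGetValue(objectLabel, out clip);
}
```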
🔍 User Flow
After picking the major features for the MVP, we created a complete user flow depicting how users onboard and interact with the app. The flow is as follows:
Onboarding -> Detect objects -> Assign sounds -> Reassign sounds on objects -> Share the creation
🔍 Figma Prototype
After clarifying the user flow, we wireframed and created both low- and high-fidelity prototypes to shape a user-friendly, accessible interface, then handed the prototype to the development team. In this step, we produced a clickable Figma prototype that simulates how users interact when composing music with detected real-life objects, with advanced motion design and animations between pages and within components.
During the process, we overcame challenges such as communicating design ideas effectively with everyone on the team, and working with the developers to turn technical difficulties into creative designs.
A closer look
Mickey Mouse Menu
(Adjust sound volume, play/stop, and reassign)
🔍 Look & Feel
It's important to have a good name for the product and keep the visuals consistent throughout the whole experience. In this step, we designed the following:
3. Icons and Components
For consistency, we used the main colors for the icons and components in the app. We also decided to make the icons look 3D, since this app connects real-world objects with digital music.
🔍 Dev Process
Here comes the development process! It was never going to be easy, but thanks to my cute and talented team members, we eventually shipped a functional app that all our classmates could test on their own phones. The big learnings in this step were getting to know the technical side, refining designs based on the development team's feedback, and convincing the developers to act on some difficult but valuable ideas.
Step 1: Searched for Unity projects using YOLO object detection
We found YOLO, a state-of-the-art, real-time object detection system: a pre-trained convolutional neural network that can recognize a fixed set of classes. The algorithm was cool and worth a try, but it kept freezing in our tests and detected the wrong objects, so we needed to find another way to detect objects.
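The write-up doesn't record how YOLO was wired into Unity; one common route is Unity's Barracuda inference package with an ONNX export of the model. A minimal, hypothetical sketch (decoding of the raw output omitted):

```csharp
using Unity.Barracuda;
using UnityEngine;

// Hypothetical sketch: running an ONNX export of a YOLO model with Unity's
// Barracuda package. Not our actual integration, just one plausible setup.
public class YoloDetector : MonoBehaviour
{
    public NNModel modelAsset; // ONNX model imported into the Unity project
    IWorker worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.ComputePrecompiled, model);
    }

    public Tensor Detect(Texture2D frame)
    {
        using (var input = new Tensor(frame, channels: 3))
        {
            worker.Execute(input);
            return worker.PeekOutput(); // raw boxes/classes; decoding omitted
        }
    }

    void OnDestroy() => worker?.Dispose();
}
```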
Step 2: Built basic functionality using placeholder objects
In this step, we built the basic functions, like placing the menu on placeholder objects and choosing songs from the sound library. We solved problems such as making the menu always face the camera and switching between levels for different functions.
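The usual way to keep a world-space menu facing the camera in Unity is a small billboard script; this is the standard pattern rather than our exact code:

```csharp
using UnityEngine;

// Keeps a world-space menu facing the AR camera every frame.
public class Billboard : MonoBehaviour
{
    void LateUpdate()
    {
        var cam = Camera.main;
        if (cam == null) return;
        // Point the menu's forward axis away from the camera so its
        // contents stay readable from the user's viewpoint.
        transform.rotation = Quaternion.LookRotation(transform.position - cam.transform.position);
    }
}
```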
Step 3: Got our selected object detector running
Thanks to our brilliant developer, we figured out another way to detect objects at a much faster speed than YOLO. The item to be detected has to sit inside the rectangular area in the middle of the interface.
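One plausible reading of the center-rectangle constraint is that only the pixels inside that rectangle are handed to the detector; a hypothetical helper:

```csharp
using UnityEngine;

// Hypothetical sketch: extract the centered rectangle of a camera frame,
// matching the detection frame drawn in the middle of the UI.
public static class CenterCrop
{
    public static Color[] Crop(Texture2D frame, int width, int height)
    {
        int x = (frame.width - width) / 2;
        int y = (frame.height - height) / 2;
        return frame.GetPixels(x, y, width, height);
    }
}
```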
Step 4: Combined object detection with basic functionality, set up control flow
In this step, our developers combined object detection with the basic functionality and set up the control flow that connects the basic functions.
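The post doesn't spell out how the control flow was implemented; a simple state machine mirroring the user flow above is one natural shape (all names hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch of the control flow linking the basic functions,
// following the onboarding -> detect -> assign -> play -> share path.
public enum AppState { Onboarding, Detecting, AssigningSound, Playing, Sharing }

public class FlowController : MonoBehaviour
{
    public AppState State { get; private set; } = AppState.Onboarding;

    public void OnboardingDone()  => State = AppState.Detecting;
    public void OnObjectDetected() { if (State == AppState.Detecting) State = AppState.AssigningSound; }
    public void OnSoundAssigned() => State = AppState.Playing;
    public void OnReassign()      => State = AppState.AssigningSound; // reassign loops back
    public void OnShare()         => State = AppState.Sharing;
}
```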
Step 5: Embedded music menu VFX
The music menu is important in our product: it visualizes the rhythm and tempo of the music and keeps users engaged throughout the creation process.
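The exact VFX pipeline isn't described; one simple way to drive a menu visual from the playing sound in Unity is AudioSource.GetSpectrumData, as in this hypothetical sketch:

```csharp
using UnityEngine;

// Hypothetical sketch: pulse the menu with the music by sampling the
// playing clip's frequency spectrum each frame.
[RequireComponent(typeof(AudioSource))]
public class MenuPulse : MonoBehaviour
{
    readonly float[] spectrum = new float[64]; // length must be a power of two
    AudioSource source;
    Vector3 baseScale;

    void Start()
    {
        source = GetComponent<AudioSource>();
        baseScale = transform.localScale;
    }

    void Update()
    {
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);
        float energy = 0f;
        for (int i = 0; i < spectrum.Length; i++) energy += spectrum[i];
        // Scale the menu slightly with the clip's overall spectral energy.
        transform.localScale = baseScale * (1f + Mathf.Clamp01(energy) * 0.2f);
    }
}
```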
🌀 Coding chaos
During the process, there was definitely a lot of coding chaos, listed below. Luckily, we were able to sort out the collaboration issues between Mac and Windows and the different difficulties of developing for Android and iOS users. We learned a lot about communicating and collaborating effectively and, in the end, succeeded in shipping our baby MVP.
Setting up and dealing with Unity and Git
Collaboration issues between Mac and Windows (solved with a .gitignore; see the snippet after this list)
Mysterious Unity/Xcode errors that fixed themselves
No debug log messages for Android
Limited error info for iOS
Long waits for builds
Some things in Unity worked exactly as expected, while others mysteriously didn’t work (for example: SetActive on a parent component vs. on each child)
Sleep-deprived developers making silly mistakes 🤡
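For reference, the Mac/Windows fix above amounts to ignoring Unity's machine-generated folders; this is a trimmed version of the widely used community Unity .gitignore template, not our exact file:

```gitignore
# Unity regenerates these locally; committing them causes cross-machine conflicts.
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
[Bb]uilds/
[Ll]ogs/
[Uu]ser[Ss]ettings/
# IDE project files are regenerated per machine.
*.csproj
*.sln
```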