
Gizmu

MAKE YOUR TUNE WITH EVERYDAY OBJECTS

This three-week project is about expressing yourself through music. Gizmu is a music app that utilizes augmented reality to help users become composers and create interesting music with common objects accessible to anyone. 

Ambition
How might we engage everyone to be a composer and create interesting music by using common objects?

Solution
UX, AR

Team
Athena Hosek, Eva Chen, James Lee, Kristine Yang, Marina Guthmann, Yen Huang

Responsibilities
sketches, user flows, wireframing, prototyping, documentation

Tools
Figma, Adobe Illustrator, Photoshop, After Effects,
GitHub, Unity

Concept

This project is about expressing yourself through the creation of music, with the help of everyday objects! 

 

Anyone can be a composer and create interesting music with common objects that everyone already has at hand.

Goal

Empower people to become composers


Here comes Gizmu 
- an AR app that helps you make your tune with everyday objects -


Demo
- Let's have fun! -

(A kind reminder: this video contains sound, so please turn down your volume if needed)

Design Process


Discover

Define

Develop

Deliver

Discover

🔍 Technology

Initially, we wanted to build the app for an AR device such as the Magic Leap headset, because its hand-tracking function would let users keep their hands free. However, after weighing the team's development familiarity, users' access to devices, and our ability to build for different platforms, we decided to make a mobile app so the broader public could play with it.

🎯 Magic Leap vs. Mobile

Choosing which device to develop for is never easy. Weighing the aspects below, developing the app for mobile is more accessible for users and more manageable for our team. In addition, we can build for multiple platforms to reach both Android and iOS users.

Magic Leap vs. Mobile:

- Hands-free use: Magic Leap
- Dev familiarity: Mobile
- User access: Mobile
- Multi-platform support: Mobile

Define

🔍 Main Features

How can we inspire users to create music? At the beginning, we brainstormed several ways to engage users in this wonderful music experience and spark creativity in music composition. However, given the short timeline, we decided to build a minimum viable product (MVP) with the major features within these three weeks and test it with users later to decide the future product path.

🎯 MVP Features

In these three weeks, we will develop an MVP focused on the music creation process. It contains three features: object detection, which turns real-life everyday objects into creation sources; music assignment, which assigns and visualizes music on different common objects; and the sound library, a collaborative database of creative sounds for users.

Must have (MVP):
- Object Detection
- Music Assignment
- Sound Library

Could have:
- Account
- Work Library
- Screen Record
- Sound Record
- Sound Upload

🔍 User Flow

After picking the major features for the MVP, we created a full user flow that depicts users' onboarding path and their interactions with the app. The flow is as follows:
Onboarding -> Detect objects -> Assign sounds -> Reassign sounds on objects -> Share the creation

Develop

🔍 Figma Prototype

After clarifying the user flow, we wireframed and created both low- and high-fidelity prototypes to shape a real user interface that is user-friendly and accessible, then handed the prototype to the development team. In this step, we produced a clickable Figma prototype that simulates how users compose music with detected real-life objects, including advanced motion design and animations between pages and within components.

During the process, we overcame challenges such as effectively communicating design ideas across the team and, together with the developers, acknowledging technical constraints and turning them into creative designs.

A closer look

Greetings

Onboarding

Detect Object

Assign Sound

Mickey Mouse Menu
(Adjust sound volume, play/stop, and reassign)

🔍 Look & Feel

It's important to give the product a good name and keep the visuals consistent throughout the whole experience. In this step, we designed the following:

3. Icons and Components
For consistency, we utilized the main colors for the icons and components in the app. We also decided to make icons 3D-like because this app connects real-world objects with digitalized music.

🔍 Dev Process

Here comes the development process! It's never easy, but thanks to my cute and talented team members, we eventually shipped a functional app that all our classmates could test on their own phones. The big learnings in this step were getting to know the technical side better, refining designs based on the development team's feedback, and convincing the developers to take on some difficult but valuable ideas.

Step 1: Searched for Unity projects using YOLO obj detection
We found YOLO, a state-of-the-art real-time object detection system. It is a pre-trained convolutional neural network that can recognize a fixed set of object classes. The algorithm was cool and worth a try, but it kept freezing during our tests and often detected the wrong objects, so we needed to find another way to detect objects.

Source: https://github.com/leggedrobotics/darknet_ros
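For context, YOLO-style detectors don't just output clean results: their raw predictions need post-processing, typically a confidence threshold followed by non-max suppression (NMS) to drop overlapping duplicate boxes. A minimal sketch of that step in plain Python, with made-up detection values (the actual project wired detection up inside Unity, not like this):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def postprocess(detections, conf_thresh=0.5, iou_thresh=0.45):
    """Keep confident boxes, then drop overlapping duplicates (greedy NMS)."""
    boxes = [d for d in detections if d["conf"] >= conf_thresh]
    boxes.sort(key=lambda d: d["conf"], reverse=True)
    kept = []
    for d in boxes:
        if all(iou(d["box"], k["box"]) < iou_thresh for k in kept):
            kept.append(d)
    return kept

# Hypothetical raw detections for one frame:
raw = [
    {"label": "cup",    "conf": 0.92, "box": (10, 10, 50, 60)},
    {"label": "cup",    "conf": 0.81, "box": (12, 12, 52, 62)},  # overlaps the first
    {"label": "bottle", "conf": 0.30, "box": (80, 20, 120, 90)}, # too uncertain
]
print([d["label"] for d in postprocess(raw)])  # → ['cup']
```

When a detector freezes or misfires, loosening or tightening these two thresholds is often the first knob to turn, which is part of why we kept the post-processing separate from the model itself.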

Step 2: Built basic functionality using placeholder objects
In this step, we built the basic functions, such as placing the menu on placeholder objects and choosing songs from the sound library. We solved problems like keeping the menu always facing the camera and switching between levels for different functions.
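Keeping the menu facing the camera is the classic "billboard" trick: every frame, orient the menu so its forward axis points along the menu-to-camera direction. In Unity this is usually a one-liner with `transform.LookAt`, but the underlying math is just a normalized direction vector. A language-agnostic sketch in Python, with hypothetical coordinates:

```python
import math

def look_direction(menu_pos, camera_pos):
    """Unit vector from the menu toward the camera; pointing the menu's
    forward axis along this every frame keeps it facing the user."""
    dx, dy, dz = (camera_pos[i] - menu_pos[i] for i in range(3))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

# Camera two units "behind" the menu along -z:
print(look_direction((0, 0, 0), (0, 0, -2)))  # → (0.0, 0.0, -1.0)
```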

Step 3: Got our selected object detector running
Thanks to our brilliant developer, we found another way to detect objects that runs much faster than YOLO. The item has to be placed inside the rectangular area in the middle of the interface to be detected.

Step 4: Combined object detection with basic functionality, setup control flow
In this step, our developers combined object detection with the basic functionality and set up the control flow to connect basic functions.
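A control flow like this is essentially a small state machine mirroring the user flow (onboarding → detect → assign → adjust → share). A hedged sketch with made-up state names, not the project's actual code:

```python
# Hypothetical app states and allowed transitions, mirroring the user flow.
TRANSITIONS = {
    "onboarding": ["detect"],
    "detect":     ["assign"],
    "assign":     ["adjust", "detect"],  # tweak sounds, or detect more objects
    "adjust":     ["assign", "share"],
    "share":      [],
}

def step(state, target):
    """Move to `target` only if the user flow allows it."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"can't go from {state} to {target}")
    return target

state = "onboarding"
for nxt in ["detect", "assign", "adjust", "share"]:
    state = step(state, nxt)
print(state)  # → share
```

Making the transitions explicit like this is one way to keep each feature (detection, assignment, the menu) isolated while still connecting them in a predictable order.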

Step 5: Embedded music menu VFX
The music menu is important in our product as it visualizes the rhythms and tempo in music and engages users in the whole music creation process. 

🌀 Coding chaos

During the process, there was definitely a lot of coding chaos, listed below. Luckily, we managed to resolve the collaboration issues between Mac and Windows and the various difficulties of developing for both Android and iOS. We learned a lot about communicating and collaborating effectively, and thus succeeded in shipping our baby MVP.

  • Setting up and dealing with Unity and Git

  • Collaboration issues between Mac and Windows (solved with .gitignore)

  • Mysterious Unity/Xcode errors that fixed themselves

  • No debug log messages for Android

  • Limited error info for iOS

  • Long waiting times for builds

  • Some things in Unity worked exactly as expected, while others mysteriously didn’t work (for example: SetActive on a parent component vs. on each child)

  • Sleep-deprived developers making silly mistakes 🤡
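For reference, the Mac/Windows fix mostly comes down to keeping machine-generated Unity folders out of Git. A typical Unity `.gitignore` fragment looks roughly like this (the exact entries depend on the Unity version, so treat this as a sketch rather than our project's actual file):

```gitignore
# Unity-generated folders that differ per machine/OS — never commit these
/Library/
/Temp/
/Obj/
/Logs/
/Build/
/UserSettings/

# OS junk
.DS_Store
Thumbs.db
```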

Deliver

🔍 Demo

This demo shows the following user flow:
Onboarding
-> Detect objects -> Assign sounds -> Adjust sounds (volume, play/stop, reassign sound)

(PS: The demo is available on TestFlight for iOS users, and an APK file is provided for Android users to test.)

🎯 Future Development

Besides the basic features in the MVP, we want to expand the app with features that inspire more users: letting them record their compositions, upload their own sounds, and share their creations with friends.

MVP:
- Object Detection
- Music Assignment
- Sound Library

Future Product Vision:
- Account
- Work Library
- Screen Record
- Sound Record
- Sound Upload

More details on new features

1. The prototype for the Account, Sound Library, and Work Library features in the future product

Account

Sound Library

Work Library

2. The prototype for the Screen Record feature in the future product

Screen Record

3. The prototype for the Sound Record feature in the future product

Sound Record

4. The prototype for the Sound Upload feature in the future product

Sound Upload

✍️

Reflections

If I had more time...
1. Conduct more usability tests to iterate on the design
2. Refine the sound VFX to bring more musical dynamics to the app
3. Design to prevent unconscious errors and help users recover from them, e.g., when users can't locate the detection plane or detect the wrong object

Learnings

While developing our MVP within three weeks, I learned a lot as a design team member, collaborating not only with other designers but also with the developers. I realized the importance of prioritizing the critical tasks and features, and of weighing the tradeoffs between different designs in terms of both user experience and technical difficulty.


GIZMU TEAM!!!!!!!! FOREVER!!!!!!!
