Last week, we did our first small-scale launch of the Audible application!
We haven't submitted the app to the App Store yet (but hope to in the next few days), so we used TestFlight to distribute the app to 3 different groups of people who might enjoy Audible: a 6-person group that writes short stories together, a 3-person group that goes kayaking together on weekends, and a 3-person family that drives around together often.
Overall, although the TestFlight process was very cumbersome and definitely not something we want to use for future launches (see here for a better understanding of the TestFlight process), we were able to get some valuable insights from our data. Based on the Contract of Deliverables that we submitted two weeks ago, we established 6 metrics that we think are valuable indicators of the app's progress and success:
- Timestamps of when a user entered and exited the universal library, and timestamps of each song added from the library: this tells us how much time the user spends in the universal library per song added (see the sketch after this list).
Goal: 5 seconds per song added
- Cumulative time spent engaging with the app*
Goal: 15 minutes
- Proportion of users in a group that contribute to the queue*
Goal: ½
- Cumulative time spent pushing/pulling to sync the universal library
Goal: 5 seconds/user
- Timestamps of the beginning and end of the group creation / role choosing process: this tells us the total time a user spends in this process before they start listening to music.
Goal: 2 minutes/user
- User ratings of the app in the following categories:
1) How would you rate the app overall?
2) How easy was the app to use?
3) How likely are you to recommend it to a friend?
Goal: average score of 3 in each category
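To make the first metric concrete, here is a minimal sketch of how "time in the universal library per song added" could be computed from logged timestamps. The event type, names, and pairing logic are illustrative assumptions, not the exact code in the app:

```swift
import Foundation

// Hypothetical event record: a timestamp is logged each time a user enters or
// exits the universal library, and each time they add a song from it.
struct LibraryEvent {
    enum Kind { case enteredLibrary, exitedLibrary, songAdded }
    let kind: Kind
    let timestamp: Date
}

// "Seconds spent in the universal library per song added" for one user:
// total time between matching enter/exit pairs, divided by songs added.
func secondsInLibraryPerSongAdded(events: [LibraryEvent]) -> TimeInterval? {
    var totalTime: TimeInterval = 0
    var songsAdded = 0
    var enteredAt: Date?

    for event in events.sorted(by: { $0.timestamp < $1.timestamp }) {
        switch event.kind {
        case .enteredLibrary:
            enteredAt = event.timestamp
        case .exitedLibrary:
            if let start = enteredAt {
                totalTime += event.timestamp.timeIntervalSince(start)
                enteredAt = nil
            }
        case .songAdded:
            songsAdded += 1
        }
    }

    // If the user never added a song, the metric is undefined for them.
    return songsAdded > 0 ? totalTime / Double(songsAdded) : nil
}
```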
During the launch, we tracked all of these metrics by embedding code in the app that sent event data to our databases. To see how we did, take a look at our Launch Data summary presentation, complete with graphics about user engagement, performance, and virality!
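For those curious what that instrumentation looks like, here is a simplified sketch of an event logger that timestamps an action and POSTs it to a backend. The endpoint URL, field names, and `AnalyticsEvent` type are placeholders for illustration, not our actual schema:

```swift
import Foundation

// Placeholder analytics event, shaped after the kinds of data points listed above.
struct AnalyticsEvent: Codable {
    let userID: String
    let groupID: String
    let name: String        // e.g. "entered_library", "song_added", "sync_finished"
    let timestamp: Date
}

// Minimal logger: JSON-encodes the event and POSTs it to a placeholder endpoint.
// A production version would batch events and retry on failure; this just shows the shape.
func logEvent(_ event: AnalyticsEvent) {
    guard let url = URL(string: "https://example.com/audible/events") else { return }

    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    guard let body = try? encoder.encode(event) else { return }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = body

    URLSession.shared.dataTask(with: request).resume()
}

// Example call site: fired whenever a user opens the universal library.
// logEvent(AnalyticsEvent(userID: "u123", groupID: "g7", name: "entered_library", timestamp: Date()))
```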