TL;DR you can download ergo/ here!
This article originally appeared on Medium
Dear Reader,
In the summer of 2018 I graduated from Stanford University and prepared for the move across the country to start medical school. After packing my belongings and hauling furniture all day, I took a break and leaned on a stack of moving boxes, neck hunched over, checking emails on my phone. Suddenly, I had an idea. I opened the notes app and hastily wrote down the following:
app idea: phone vibrates when it’s too angled to prevent neck strain
Ideas often occur to me at random times and I always like to write them down. Many of them eventually get trashed, but some of them inspire new projects. In the whirlwind of beginning med school at Columbia VP&S, this particular idea became buried as notes about biochemistry 💊, genetics 🧬, and anatomy 🦴 filled up my phone.
Fast forward two years. I’m taking a study break, clearing out old notes on my phone, and the “app idea” shuffles to the top of the list. I re-read the line of text. In med school I learned about the musculoskeletal system and reviewed numerous studies associating smartphone use with neck pain, muscle fatigue, and changes in spine biomechanics (see references at the end). I realized this app could help anyone who uses a smartphone to work, play, or communicate on a daily basis.
So, with zero experience in iOS app development 📱 and Apple’s SwiftUI programming framework, I started my foray into tech for the sake of good posture. Below, you’ll find 10 journal entries based on the trials and triumphs of key steps in the 31-day process to create ergo/ (sprinkled with apropos GIFs and medical analogies for good measure).
Let’s begin!
/day 1: designing the user interface
how do I change the color?
Who knew that changing the color of my app’s main screen could be so frustrating? Not that it was difficult in the end, but I had a steep learning curve to surmount. You see, SwiftUI is a framework for designing app user interfaces; it’s a toolkit at the developer’s disposal for dictating exactly how an app should look and behave. When I looked into this toolkit for the first time, I had no idea how any of the tools functioned. I didn’t even recognize the words used to describe them. This wasn’t like gross anatomy, where I had a sense of how the different parts were assembled and organized. I was starting from scratch on Day 1.
SwiftUI is declarative, meaning you tell it what you want and it figures out how to make it happen (e.g. the attending 🥼 tells the med student to go get the patient’s history in the ER, so the med student scurries away and elicits the history from the patient). You might be wondering, “Isn’t all programming like that?” Well, not exactly. Other frameworks operate on an imperative paradigm, meaning you have to spell out how to do something step by step (e.g. the attending tells the med student to ask the patient why they came to the ER, whether this has happened before, what medications they’re taking, what allergies they have, what other conditions they have, whether any family members have had related illnesses… etc.)
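To make the distinction concrete, here’s a toy comparison of my own (not code from ergo/): the same on-screen label written declaratively with SwiftUI and imperatively with UIKit.

import SwiftUI
import UIKit

// Declarative (SwiftUI): describe WHAT you want; the framework handles the rest.
struct HistoryPromptView: View {
    var body: some View {
        Text("Go get the patient's history")
            .font(.headline)
            .foregroundColor(.blue)
    }
}

// Imperative (UIKit): spell out HOW to build and style it, step by step.
func makeHistoryPromptLabel() -> UILabel {
    let label = UILabel()
    label.text = "Go get the patient's history"
    label.font = UIFont.preferredFont(forTextStyle: .headline)
    label.textColor = .systemBlue
    return label
}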
Back to color. At its core, SwiftUI is simple, and changing the color ended up being simple, too. Here’s how I did it:
struct HomeView: View {
    var body: some View {
        ZStack {
            Color(Constants.ErgoColor)
            // more foreground objects get layered on top of this background
        }
    }
}
The first two lines set up the actual view that the user sees. The ZStack creates a stack of objects in the z-axis (going into / out of the phone screen). I use the Color() “tool” in SwiftUI and tell it to make the color Constants.ErgoColor, a hex code value that I set. Et voilà! The app’s screen background is now a custom color. I can then add more objects in the ZStack and they will appear in the app’s foreground.
It took me a whole day to figure that out. This was going to be a slow climb up the learning curve. 🧗🏻♂️
/day 3: the tutorial screen dilemma
surely someone must’ve figured this out already?
Almost every app I’ve downloaded on my phone presents some sort of tutorial screen when opened for the first time. I knew ergo/ had to have a “how-to-use” screen to orient users. After changing the color and adding some basic text elements, I needed to figure out how to add a separate screen that appeared only once: the first time the app opened.
This step took me down a rabbit hole of programming states, key-value pairs, delegates, booleans, user default storage, and environment objects. As it turns out, SwiftUI had an exceptionally simple solution that took me an exceptionally long time to figure out — #learningcurve. 📈
Here are the basics. The app can store small amounts of data in its UserDefaults folder. I needed a boolean (a fancy computing term for a true-or-false value) to let the app know whether it was being opened for the first time, so I set the initial value to false. As soon as the user opened the app and viewed the tutorial, the boolean was set to true, and this value was shared as an environment object (i.e. the app’s code can reference it anywhere it’s needed). Once that happened, the tutorial screen never appeared again.
Here’s what this looked like:
class ScreenViewController: ObservableObject {
    init() {
        if !UserDefaults.standard.bool(forKey: "first") {
            UserDefaults.standard.set(true, forKey: "first")
            showScreen = "tutorialscreen"
        } else {
            showScreen = "mainscreen"
        }
    }

    @Published var showScreen: String
}
English translation of the above coding logic: if the key 🗝 stored in UserDefaults called “first” has a value equal to false, show the tutorial screen and change the value to true. If the value is already true, only show the app’s main screen. The line at the very end of this code block communicates to a separate function that handles the presentation of screens.
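To show where that last line leads, here’s a minimal sketch of my own (not ergo/’s exact code; TutorialView is a placeholder name for the how-to-use screen) of a root view reading showScreen from the environment and deciding what to present:

import SwiftUI

// The root view watches the shared ScreenViewController in the environment
// and shows whichever screen its showScreen value names.
struct RootView: View {
    @EnvironmentObject var screenViewController: ScreenViewController

    var body: some View {
        Group {
            if screenViewController.showScreen == "tutorialscreen" {
                TutorialView()   // placeholder for the tutorial screen
            } else {
                HomeView()       // the main screen from Day 1
            }
        }
    }
}

Somewhere higher up, the same ScreenViewController instance gets injected once with .environmentObject(ScreenViewController()), which is what makes it visible to any view that asks for it.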
Just as interviewing patients and formulating a care plan became easier after a couple months of clinical work, this coding business was starting to make sense…
/day 7: crashing the system
why am I doing this?
…Until it didn’t.
“All I wanted to do was change the name!”
Fun fact: ergo/ was not the original name. The first name I came up with was Poise.
“poise (noun) \ ˈpȯiz \ : a stably balanced state (Merriam-Webster)”
It sounded calm, light, and gracious. A perfect name for a posture app. Until it was pointed out to me that Poise was already taken.
“Poise® : Nobody wants to worry about bladder leakage. (Poise Pads)”
To avoid brand confusion, I decided it was wise to change the name. ergo/ is an abbreviation of ergonomic; it’s short, fun to say, and catchy. I went into the settings of Xcode, Apple’s app development program, deleted “poise,” and changed my project name to “ergo.”
A dozen different errors ensued. Pop-up notifications screamed at me in vague computer terminology. I tried to run the app and it failed miserably. 🥴
“What have I done?”
Xcode is notorious for being finicky. If you do not follow the steps of a process precisely, bad outcomes await you. I did not follow the steps — Xcode kicked me in the shins and laughed. I had no idea how to fix what I had done; reversing the steps only made things worse. I closed my laptop and walked away.
I came back a day later determined to find a solution. By this point, I had only around 150 lines of code in the now-broken project. I copied those 150 lines, pasted them into files in a brand-new project, and renamed the project “ergo”. Not a very elegant solution, but it worked.
When you’re working diligently and get stuck on diagnosing a patient or planning a course of care, sometimes you have to start over with the basic facts, re-read the history, double check the medical records, and consult existing research or other team members. Just as I’m resolute in my commitment to care for patients with my team, I wasn’t going to let a finicky computer program stop me from building an app to help people improve posture. 💪🏽
The next step was to build the actual sensor functionality: the most difficult challenge yet.
/day 9: defining gravity
what in the world is .xMagneticNorthZVertical?
ergo/ needed to sense the user’s phone orientation in 3D space so that it could send a notification when the phone was not held in an ergonomic position. To illustrate what this means, take a look at this diagram:
When the user’s phone tilts forward, their neck angles downward to look at the screen. When the phone is held in a more upright position 🤳🏽, the neck and head become aligned with the rest of the spine.
Most smartphones come with a built-in gyroscope (which measures rotation) and accelerometer (which measures acceleration, including the pull of gravity, and therefore tilt) to sense device motion. The trouble is accessing this complex data and extracting values in a meaningful way. I took math and physics in college, but this felt like a different ball game. How on earth could I process the data to remove bias from other factors? The user could be holding their phone upside down, setting the phone on a desk, walking along the street, riding in a car, or taking the train, and the sensors would have to recognize when the phone was angled in a truly improper position.
Apple must’ve heard my prayers, because they released updated documentation on Getting Processed Device-Motion Data. With this tool, I could get all the data I needed. They even provided a fully functional code template! This is like a fellow med student creating and sharing an Anki deck with all the relevant material for the final exam — bless you, Anki god! And thank you, Apple. Here’s what the code template looks like:
func startDeviceMotion() {
    if motion.isDeviceMotionAvailable {
        self.motion.deviceMotionUpdateInterval = 1.0 / 60.0
        self.motion.showsDeviceMovementDisplay = true
        self.motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical)

        // Configure a timer to fetch the motion data.
        self.timer = Timer(fire: Date(), interval: (1.0 / 60.0), repeats: true, block: { (timer) in
            if let data = self.motion.deviceMotion {
                // Get the attitude relative to the magnetic north reference frame.
                let x = data.attitude.pitch
                let y = data.attitude.roll
                let z = data.attitude.yaw

                // Use the motion data in your app.
            }
        })

        // Add the timer to the current run loop.
        RunLoop.current.add(self.timer!, forMode: RunLoop.Mode.default)
    }
}
I took one look at this for the first time and thought “Huh?” 🤷🏻♂️
It took a couple of hours, but I broke it down line by line until I understood what was happening. The first five lines check whether the sensors are available, set the frequency for the sensor updates, and establish the reference frame for the spatial axes. The term .xMagneticNorthZVertical describes that reference frame: its z-axis is vertical (aligned with gravity) and its x-axis points toward magnetic north, and the phone’s orientation is reported relative to it. The more you know. 💁🏻♂️
The rest of the template sets up a timer to read the data at the same frequency as the sensor updates (the sensors fire without explicitly telling anyone that they’ve actually started — any student late for morning lecture can commiserate) and records the attitude values: pitch, roll, and yaw.
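To give a sense of how those attitude values can become a posture check, here’s a rough sketch of my own (the 45° cutoff and the logic are illustrative assumptions, not ergo/’s actual calibration): pitch sits near 90° when the phone is upright in front of your face and falls toward 0° as the screen tilts flat, so a simple check compares it against a threshold.

import CoreMotion

// Illustrative only: the 45° cutoff is an arbitrary example value.
func isHeldInImproperPosition(_ data: CMDeviceMotion) -> Bool {
    // Pitch is reported in radians; convert to degrees for readability.
    let pitchDegrees = data.attitude.pitch * 180.0 / .pi
    // An upright phone reads roughly 90°; a small pitch means the screen is
    // tilting toward flat and the user's neck is likely bent downward.
    return abs(pitchDegrees) < 45.0
}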
With the code template, my app started working as I had envisioned. This step wasn’t nearly as challenging as I had thought.
I should have known that Xcode was going to blindside me.
/day 16: bug hunter extraordinaire
can anybody help me?
What should have happened:
User holds phone in improper position, sensors recognize this, and app adds +1 to a posture correction counter on the main screen.
What actually happened:
*whole lot of nothing*
No matter what I did, the counter would not update in real-time. I scoured StackOverflow (popular Q&A site for programmers), the Apple Developer Forums, SwiftUI tutorials, programming books, an hour-long World Wide Developer Conference video presentation, and even posted on a forum. No luck.
The patient was obviously not well, but system scans showed no signs of illness. No error messages appeared. No pop-up notifications screamed at me in vague computer terminology. Nevertheless, my code had a bug. 🐜
This went on for a week until one kind soul replied to my forlorn forum post.
“It looks like you’re referencing a new instance, when you should instead reference the environment object instance you already created.” — kind soul
The feeling was electric. The answer I sought was so simple and wildly disproportionate to the level of vexation the bug had caused. Once I referenced the existing environment object, the counter worked seamlessly.
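In code terms, the bug looked roughly like this (simplified, with placeholder names like MotionManager and correctionCount rather than the exact lines from ergo/):

import SwiftUI

// Placeholder model for illustration: one shared object publishes the count.
class MotionManager: ObservableObject {
    @Published var correctionCount = 0
}

struct CounterView: View {
    // Wrong: this creates a brand-new MotionManager, so the object doing the
    // sensing increments its own count while the screen watches a different one.
    // @ObservedObject var motionManager = MotionManager()

    // Right: reference the single instance already injected into the environment.
    @EnvironmentObject var motionManager: MotionManager

    var body: some View {
        Text("posture corrections: \(motionManager.correctionCount)")
    }
}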
Without getting into the specifics of value types and reference types, structs and classes, and published and environment objects, I will extend the same analogy used previously. If the attending you’re working with says “go to the waiting room and interview our next patient,” you must go to the exact waiting room they are referring to. If you go to a different place (such as the outpatient waiting room instead of the ER waiting room), you’ll return to your attending with zilch, and their blank stare will *politely* question your competency.
For the love of all things declarative, in medicine and in programming, make sure you are referencing the same exact item in question. Whether it’s an environment object, a waiting room, a key-value pair, or a bottle of medication, many of our errors can be prevented by double checking that we collectively reference the same item. If there is any doubt, ask for clarification. Your medical team (and your coding sanity) will ultimately thank you. 🙂
/day 22: notification calibration
how will users know to improve their posture?
With the big bad bug squashed and the app running smoothly, I now needed to create notifications that would fire whenever the phone was held in an improper position.
To achieve this, I created a dictionary of key-value pairs (remember those from Day 3?). The key is a unique number identifier, and the value is the posture cue to be presented. A Medical Record Number associated with a patient’s chart is another example of a key-value pair. After researching helpful cues to improve posture, I generated the following (abbreviated) list:
let correctionDictionary: [Int: String] = [
    1  : "keep your head high and back straight!",
    2  : "bring your phone up higher",
    3  : "try keeping your chin parallel to the floor",
    4  : "ears should line up with your shoulders",
    5  : "pull shoulders back, feet shoulder width apart",
    6  : "remember: knees shouldn’t be locked 🐝🦵🏽",
    7  : "if phone is below sternum, move it higher!",
    8  : "stand like a superhero wearing a cape!",
    9  : "make sure hips are square and not tilted",
    10 : "tip: elbow should be held out in front of chest 🤳🏽"
]
When the user’s phone is held in an improper orientation, they receive notifications that look like this:
Simple, short, and sweet. Friendly reminders to improve posture, and a cue that provides the user an immediate way to do so. After the notification is presented, it is cleared from the phone.
Why?
Well, have you ever opened your phone to find a dozen notifications from 5 different apps that you don’t really care about? I designed ergo/ to be as minimally intrusive to your day as possible, and #notificationfatigue 💤 is a real problem. Technology should help, not hinder.
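For the curious, here’s a rough sketch of how a posture cue could be delivered and then cleared with Apple’s UserNotifications framework (my own simplified illustration — the identifier, timing, and clearing delay are assumptions rather than ergo/’s exact code, and the app must have asked for notification permission beforehand):

import Foundation
import UserNotifications

// Pick a random cue, schedule it as a local notification, then clear
// delivered notifications so they don't pile up (#notificationfatigue).
func sendPostureCue(from correctionDictionary: [Int: String]) {
    guard let cue = correctionDictionary.values.randomElement() else { return }

    let content = UNMutableNotificationContent()
    content.title = "ergo/"
    content.body = cue

    // Fire shortly after an improper position is detected.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 1, repeats: false)
    let request = UNNotificationRequest(identifier: "postureCue",
                                        content: content,
                                        trigger: trigger)
    UNUserNotificationCenter.current().add(request)

    // Remove the delivered notification after a short delay.
    DispatchQueue.main.asyncAfter(deadline: .now() + 10) {
        UNUserNotificationCenter.current().removeAllDeliveredNotifications()
    }
}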
With notifications working correctly along with the device sensors and user interface, the app was nearly complete.
Did I think about submitting it directly to the App Store for review? Yes.
Did I do that? No!
People needed to test it out first.
I enlisted the help of friends and family to download ergo/ and give it a go.
/day 24: testing the MVP
will people actually like this?
Sharing my minimum viable product (MVP) with other people brought a mix of excitement and anxiousness.
On one hand, I had created (what I thought was) a useful free app that anyone could benefit from.
On the other hand, would people actually like it?
Friends and family tried it out. I told them to “try to break it” so I could see whether my programming and bug-hunting abilities were up to the task.
First, some of the compliments:
This is really cool!
Nice! How did you come up with this?
Great, but aren’t you supposed to be studying medicine?
Then, some of the suggestions:
You should make a calendar so people can see their improvements over time
I wish there was a dedicated page explaining all the components and features
You should really change the color…
I appreciated all the kind comments (yes, I’m still studying medicine, dad) and thought about how to incorporate the suggestions. I changed the color to the beautiful ergo/ blue used throughout images in this article (it was initially a mint green 🌿), added a dedicated features page based on the information in the tutorial screen, and implemented a sleek way to access different calendar dates to graph the user’s posture improvement data over time.
Training mode lets you see how tilting your phone affects the on-screen animation. Background mode lets you check your posture at different frequencies throughout the day without needing to open the app. The bar charts show your posture points and posture corrections from the last week.
With feedback, my minimum viable product became a full-fledged application ready for submission to the App Store. I reviewed the official Apple submission checklist and discovered I was missing one final component.
/day 29: apple™ and oranges and data security
how do I make a privacy policy?
Apple takes privacy very seriously, and every app is required to have a privacy policy. 🔒 Today, many free apps lure people in and sell their user data behind the scenes. For example, social media apps can log your IP address, track your online behavior, and share your data with third parties. The user is monetized unwittingly because their data is valuable to companies.
ergo/ does none of this. All data is stored locally on your device, and is never shared with anyone. ergo/ was not designed to generate any profit; it was explicitly designed to improve posture, and that’s it! You can take a look at the full privacy policy here.
I didn’t have to reinvent the wheel to create the policy, either. Thankfully, there are many free resources to help people generate their own privacy policies. A quick Google search for “privacy policy generator” does the trick. Once you have the template, you can edit it to fit the specific needs of your company or app.
With the privacy policy completed, I was ready to submit ergo/ for review by the mysterious people controlling the iOS App Store’s silicon-coated security gates. I submitted the app build, artwork, privacy policy, and text descriptions. I even submitted an extra document on how to use the app, which is recommended to make the process easier for the reviewers.
After almost a month of work, this tech roller-coaster had finally crested the peak of the learning curve. 🎢 I threw my hands in the air and waited in anticipation for the good news to roll in.
/day 30: submission denied
now what do you want?
Ugh. I knew apps could be denied on the first submission, but I was hoping it wouldn’t happen. I had prepared all the materials and followed the submission guidelines — what more could they want?
As it turns out, the supplied privacy policy link didn’t work as desired by the reviewers. The link went to a website that contained the privacy policy, but it did not go directly to the privacy policy (the universe was telling me to be more specific and declarative). I easily fixed the link.
However, despite the screen shots and screen recordings I submitted, the reviewers also wanted a video of how the app worked on a physical device. Fair enough. I borrowed another phone to record 🎥 ergo/ operating on my phone (#meta), and narrated the functions and features as I interacted with them. I re-submitted the app for review without any fanfare and waited.
/day 31: it’s the climb
okay, now what?
Less than two hours after resubmission, the reviewers responded:
Hallelujah! After 30 days of programming and 1600 lines of code, ergo/ was live on the App Store. The journey felt a little surreal, because 30 days before this approval message, I didn’t know how to change colors. As with so many things worth doing, I focused on taking small steps to climb the mountain of required learning.
Today, although there is much more to learn, I now know how to approach app development and am more comfortable with the creative processes responsible for so many medical and health technologies. For example, many of your favorite exercise tracking apps use the same gyroscope and accelerometer sensors to measure your activity. In hospital electronic medical records, numerous calculations are happening in the background to generate tables and charts that help providers make informed decisions with their patients. Health care systems are rolling out telemedicine portals with customizable user interfaces to serve more patients at a distance.
I created this free app to improve posture, but I also learned how to take an idea and bring it to life as a resource that helps people at scale. As I write this paragraph, ergo/ is being downloaded in countries across the world. 🌎 It would be impractical for me to travel to each of these countries and help people improve their posture by myself, but with a simple app I can positively influence everyday ergonomics.
Future health care providers will benefit from understanding the basic foundations of technologies powering our patient care. If you have an idea with promise, pursue it. If you don’t know where to start, feel free to reach out — because technology is evolving health care every day: right under our noses, stethoscopes, and scalpels.
Best,
Sandro
ergo/’s got your back! download it for free on the iOS App Store.
The research below inspired the creation of ergo/
📚
Alshafai, N., & Aldhafeeri, W. (2018). The effect of modern technology on cervical spine biomechanics. Literature review. Canadian Journal of Neurological Sciences / Journal Canadien Des Sciences Neurologiques, 45(s2), S54–S54. https://doi.org/10.1017/cjn.2018.244
D’Anna, C., Varrecchia, T., Bibbo, D., Orsini, F., Schmid, M., & Conforto, S. (2018). Effect of different smartphone uses on posture while seating and standing. In MeMeA 2018–2018 IEEE International Symposium on Medical Measurements and Applications, Proceedings. https://doi.org/10.1109/MeMeA.2018.8438686
Eitivipart, A. C., Viriyarojanakul, S., & Redhead, L. (2018). Musculoskeletal disorder and pain associated with smartphone use: A systematic review of biomechanical evidence. Hong Kong Physiotherapy Journal, 38(2), 77–90. https://doi.org/10.1142/S1013702518300010
Han, H., Lee, S., & Shin, G. (2019). Naturalistic data collection of head posture during smartphone use. Ergonomics, 62(3), 444–448. https://doi.org/10.1080/00140139.2018.1544379
Hansraj, K. K. (2014). Assessment of stresses in the cervical spine caused by posture and position of the head. Surgical Technology International, 25, 277–9. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/25393825
Ko, P. H., Hwang, Y. H., & Liang, H. W. (2016). Influence of smartphone use styles on typing performance and biomechanical exposure. Ergonomics, 59(6), 821–828. https://doi.org/10.1080/00140139.2015.1088075
Lee, S.-Y., Lee, D.-H., & Han, S.-K. (2016). The Effects of Posture on Neck Flexion Angle While Using a Smartphone according to Duration. Journal of The Korean Society of Physical Medicine, 11(3), 35–39. https://doi.org/10.13066/kspm.2016.11.3.35
Losch, D., Groneberg, D. A., Ohlendorf, D., & Wanke, E. M. (2017). Text neck. Zentralblatt Fur Arbeitsmedizin, Arbeitsschutz Und Ergonomie, 67(4), 234–236. https://doi.org/10.1007/s40664-017-0190-4
Namwongsa, S., Puntumetakul, R., Neubert, M. S., & Boucaut, R. (2019). Effect of neck flexion angles on neck muscle activity among smartphone users with and without neck pain. Ergonomics. https://doi.org/10.1080/00140139.2019.1661525
Park, J. H., Kang, S. Y., Lee, S. G., & Jeon, H. S. (2017). The effects of smart phone gaming duration on muscle activation and spinal posture: Pilot study. Physiotherapy Theory and Practice, 33(8), 661–669. https://doi.org/10.1080/09593985.2017.1328716
So, Y.-J., & Woo, Y.-K. (2014). Effects of Smartphone Use on Muscle Fatigue and Pain and, Cervical Range of Motion Among Subjects With and Without Neck Muscle Pain. Physical Therapy Korea, 21(3), 28–37. https://doi.org/10.12674/ptk.2014.21.3.028
Environmental Health & Safety (2013). Ergonomics Guidance for Mobile Devices. Stanford University.
Xie, Y. F., & Szeto, G. P. Y. (2015). A study of muscle activity in using touchscreen smartphone among young people with and without neck-shoulder pain. Physiotherapy, 101, e1668–e1669. https://doi.org/10.1016/j.physio.2015.03.068
Xie, Y. F., Szeto, G., Madeleine, P., & Tsang, S. (2018). Spinal kinematics during smartphone texting — A comparison between young adults with and without chronic neck-shoulder pain. Applied Ergonomics, 68, 160–168. https://doi.org/10.1016/j.apergo.2017.10.018
👃🏽🩺✍🏽
#medical student #posture #iOS app development #tech #entrepreneurship