I recently competed in an event at the University of Rhode Island called Immerse-A-Thon, a competition centered around designing innovative VR/AR applications. This article is a detailed telling of how I became involved with the event, what the stages of my app prototype development looked like, behind-the-scenes takeaways from mentor meetings that happened during the event, challenges I encountered, solutions I came up with, and a narrative of the competition from my perspective. I’ve included images and videos captured along the way. I’m learning a ton by reflecting on the experience through this writing. If anything, this should be an interesting look at what led up to the event, what happened during the event, and what the result was. Let us begin!
At the end of my Spring ’20 semester at URI in the Master’s of Adult Education program, I became intensely interested in the possibilities of XR (AR/VR) in the educational space. Looking back, there were a few elements that seeded my interest. COVID-19 had completely disrupted the way I was teaching classes for the community-based Adult Ed program that I work for. I was now teaching ESL exclusively online, and I was beginning to understand the strengths of online teaching and the opportunities it presented. Additionally, through my studies, I was introduced to a corporate trainer from the local company AAA and the tools that the trainer used to create their educational materials. One of the tools shown to us was Adobe Captivate, which basically allows you to create dialogue scenarios and quizzes in a nice-looking package. Finally, as a gamer, I had looked into VR again, and I was blown away by the advancements that had taken place over the past few years.
Fast forward a few months.
I messed around with Adobe Captivate for a bit, and it felt limited. I had some experience plunking around with development engines like Unity 3D and Unreal Engine, which basically allow you to create anything…as long as you know how to program it to do what you want. As cool as Captivate was, I wanted to do more than create dialogue branches and quizzes. As I became more interested in XR and experimented with an Oculus Rift-S, it became apparent to me that VR was going to represent an absolute sea change in education once the tech became more easily available and developers got better at making educational content. I was determined to be a part of that sea change. I started looking up people at URI who were involved with VR.
The first person I found was Dr. Bin Li, an Assistant Professor in the Department of Electrical, Computer and Biomedical Engineering at URI. Along with doing a bunch of really cool stuff around XR over wireless networks, Dr. Li was the co-organizer of an event at URI called the Immerse-A-Thon, a competition centered around designing innovative VR/AR applications. There were mentors, a pitch element, and a $1000 prize for the winning team. During our meeting, Dr. Li explained the power and potential of AR (Augmented Reality) to me. I registered for Immerse-A-Thon right away, and per Dr. Li’s guidance, I began learning more about what AR was and reengaging with my Unity 3D development skills.
Through a combination of Udemy classes and YouTube videos (I’ve linked a few of the most helpful), I was able to at least navigate the Unity interface and write extremely basic code to get things interacting with each other. As I mentioned previously, I had dabbled with game development over the past several years in both Unity and Unreal. When I had tried to learn the tools in the past, I would make something based on a tutorial, but I just never seemed to get over the “hump” of making something entirely mine, with my own thought processes. I had a decent understanding of navigating the programs, but things quickly fell apart when it came to the programming. At some point, other priorities would come up in my life and I’d tell myself, “This is just a game anyway. It doesn’t really matter that much.” And I’d fall off game development for 7 months to a year, only to start off at maybe “Square 1.5” the next time.
Developing apps for VR felt different for some reason. There was something magical about creating an environment and objects in VR and then putting my Oculus Rift-S on and being able to walk around that world and interact with objects. Plus, this time around, I was determined to attach my development interests to my professional trajectory as an educator. If I could put those two things together, I knew I’d open up a world of possibilities. I’m a builder. If I could harness the power of these tools, I could build things that would truly take advantage of my capabilities and make a large impact.
After a few months of prep, the first orientation day for Immerse-A-Thon happened on September 13, via Zoom. Organizers presented on what to expect during the event. Computer Science lecturer Indrani Mandal and PhD student Jiangong Chen gave an introductory demonstration on how to get set up with Unity and begin working with the Google Cardboard SDK (Software Development Kit). My development in Unity for the previous few months had focused on a PC-based VR solution, the Oculus Rift, which offers 6DoF (six degrees of freedom): you can walk around, and you have motion controllers and your hands to interact with things. Google Cardboard is 3DoF: you can look around 360 degrees, but you can’t walk through the space, and you don’t have motion controllers. You interact with objects by looking at them and hitting a button on the cardboard or your phone screen. This was a completely different way to think about VR.
With that presentation, all of the VR interaction and movement development techniques that I had been experimenting with and learning about just went out the window. It wasn’t a requirement for me to create something that used Google Cardboard, but my goal was to create something that students in a community-based ESL program could use. Upon further thought, I realized that organizations hosting the community-based programming would not realistically have access to high-end PC VR gear. Google Cardboard made sense: almost all of my students had smartphones that could run it. It presented major limitations versus more robust VR systems, but those limitations also got me thinking about using the core tech differently.
The final day of the Immerse-A-Thon was two weeks away, on September 26. Before the final pitch event, I had to figure out what I was going to do, form a team (I needed at least one other person on my team), and generate something interesting and valuable to pitch to a panel of judges.
I began playing around with the demo scene that Indrani and Jiangong had shown us during the orientation day event. It was a basic scene with a floating cube; if you looked at the cube and selected it, you could transform it into another shape. The demo scene provided everything I needed in terms of camera movement and an event system for interacting with objects. If you attached an Event Trigger component to an object, you could tell Unity what to do when the player looked at it or selected it. This is where my practice over the previous few months came in handy. I wasn’t great at programming yet, but I had messed around with Event Triggers and basic code, and I knew where to look for components when I needed them. I created a new scene, created a plane to stand on, and dragged in a few free, cutesy farm animals from the Unity Asset Store. My initial concept was to create a tool for teachers to load in objects, assign names and sounds to those objects, and allow students to use the app to help them study vocabulary.

I was able to get something up and running relatively quickly. When you looked at an object, such as a pig, a title would pop up that said “Pig.” When you looked at a cow, it said “Cow.” Basic stuff. I knew how to add sound from previous experimentation, so I planned on recording myself saying the names of the objects. I still didn’t have a team, which was a problem. I put out a call on the Immerse-A-Thon Slack about my project and aims, but didn’t get anything back. It was still relatively early in the competition.
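For anyone curious about how that gaze interaction was wired up, here’s a minimal sketch of the kind of component involved. It’s illustrative only: the class, field, and method names are my own, and the two public methods are meant to be hooked to an Event Trigger’s PointerEnter and PointerExit entries, as described above.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: shows an object's name while the player is looking at it.
// Attach to the object (e.g., the pig) and wire the public methods to an
// Event Trigger's PointerEnter and PointerExit entries.
public class GazeLabel : MonoBehaviour
{
    [SerializeField] private Text nameLabel;              // world-space UI Text floating near the object
    [SerializeField] private string displayName = "Pig";  // the vocabulary word to teach

    private void Start()
    {
        nameLabel.enabled = false;  // hidden until the player looks at the object
    }

    // Wired to PointerEnter: the player's gaze has landed on the object.
    public void OnGazeEnter()
    {
        nameLabel.text = displayName;
        nameLabel.enabled = true;
    }

    // Wired to PointerExit: the player's gaze has left the object.
    public void OnGazeExit()
    {
        nameLabel.enabled = false;
    }
}
```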
As I learned more about creating interactions with objects in Unity, I had an important epiphany. As much tech as I had at my disposal, including a PC and a PC-VR headset, I didn’t have a $10 Google Cardboard! All participants of Immerse-A-Thon were receiving a cardboard viewer for their phones, but those wouldn’t be mailed out until a week prior to the main event. Although a Google Cardboard viewer wasn’t the most difficult thing to find on the internet, it was still a highly niche item. You couldn’t just go down to the gas station around the corner and grab yourself a Google Cardboard viewer.
This is where my instincts as an experienced educator in generally low-resource and dynamic environments kicked in. If I didn’t have this stuff, how could I expect a school or community-based organization to provide Google Cardboard hardware? As cheap as a viewer was, it represented a friction point, and honestly, I doubted that 3DoF VR was nearly as immersive as its 6DoF counterpart. Was it worth jumping through that hardware hoop to create an effective educational app? I seriously doubted it. From my experimentation with VR, holding something up to your face that let light in and didn’t allow you to move around or interact naturally wasn’t going to produce enough immersion to warrant the hassle. That’s when I remembered something called Magic Window.
Magic Window is a Google technology that uses the gyroscope of a cell phone to let you look around just like in VR, except you don’t put the phone on your face. In VR you have two cameras, one for each eye, to create depth; in Magic Window you have one camera, and the user can also drag a finger on the phone screen to rotate the view. No extra hardware is required. After looking at Google’s resources for developers, I switched the app to work with the gyroscope rather than the two-camera setup of Cardboard and exported the demo to my phone. After testing, it felt like the right move.
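A gyroscope-driven camera only takes a few lines in Unity. The snippet below is a rough sketch of the approach rather than the exact code from my prototype; the drag speed is an arbitrary value, and the quaternion math is the commonly used pattern for mapping the phone’s gyro attitude into Unity’s coordinate system.

```csharp
using UnityEngine;

// Rough sketch: rotates the camera from the phone's gyroscope ("Magic Window"),
// with a one-finger horizontal drag adding extra yaw on top.
public class MagicWindowCamera : MonoBehaviour
{
    [SerializeField] private float dragSpeed = 0.2f;  // degrees of yaw per pixel of drag (tune to taste)
    private float dragYaw;                            // rotation accumulated from finger drags

    private void Start()
    {
        Input.gyro.enabled = true;  // the gyroscope must be enabled explicitly on mobile
    }

    private void Update()
    {
        // Accumulate horizontal finger drags as extra yaw.
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Moved)
        {
            dragYaw -= Input.GetTouch(0).deltaPosition.x * dragSpeed;
        }

        // Convert the gyro attitude (right-handed) into Unity's left-handed space,
        // then apply the drag offset around the vertical axis.
        Quaternion gyro = Input.gyro.attitude;
        Quaternion deviceRotation = Quaternion.Euler(90f, 0f, 0f) *
                                    new Quaternion(gyro.x, gyro.y, -gyro.z, -gyro.w);
        transform.rotation = Quaternion.Euler(0f, dragYaw, 0f) * deviceRotation;
    }
}
```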
But…I still didn’t know what the heck I was going to do about a team. Who was I going to get to join my team? What if my teammates had their heart set on creating a VR or AR application? After all, the event was about VR/AR!
Just as I was deep in my thought processes about team formation, Carolyn came home from her field work that day. For a little over a year, she’d been intensely involved with studying terrapin turtles. I’d gone out to the field with her a few times to attach radio tags to baby turtles the size of Oreo cookies and observe where the little radio-tagged explorers went after they were released. I’m not averse to nature and I enjoy the outdoors, but I’m also somebody who loves working on computers and playing video games. Carolyn has gotten me outdoors much more over the past four years than would have happened otherwise. She’s also taught me a lot about animals and plants and will frequently quiz me on bird or tree types when we encounter them on our hikes.
Bingo!
“Would you like to help me prototype an educational nature app for this Immerse-A-Thon event that I’m involved with?” I asked Carolyn as she put her field equipment down in our kitchen. She laughed. This type of question was a familiar setup for us already. I’d come up with some idea, plot it out in my mind, and then realize I wanted and needed her help to pull it off. Carolyn told me she didn’t know anything about app development, which I assured her I would handle. I told Carolyn that her role would be to provide meaningful and correct scientific information about animals. I could tell she liked this idea.
A little while later, while Carolyn was on the phone with her mom, she pulled her phone away from her ear and said, “Okay, I’ll do this, but on one condition…we need to win.” I smiled and went back into the studio to continue my work.
We now had a team that could compete in the competition. That night, we came up with a good team name: “Tuxedo Cat.” Our cat Bella always features prominently in our projects!

I knew the pig and the cow in our current demo weren’t going to cut it. I needed to find a bigger set of animal models, and hopefully something a bit more interesting. I took a look in the Unity Asset Store for animals and searched through their free assets. There were some dogs and cats, fish and other marine animals, and also a pack called “Living Birds,” which featured several good-looking models of common North American birds complete with animations and even bird songs! Perfect! We had a week to go before the final weekend, and I had to figure out how to put these bird models to good use.
I ditched the cow and the pig from my demo scene and kept the platform directly in front of the camera. I then brought in a crow model and placed it on the platform. I created a script with a few functions I could call, so that when you looked at the bird it presented info, and when you selected the bird it played its call. Carolyn jumped into her field biologist role and made a spreadsheet that contained bird information for all of our birds. This was super valuable to me, because she knew exactly where to find this information and how to determine whether it was accurate. As that final week went on, I created an array of birds and an interactable cube that let you cycle through them. If you selected the cube, it would move the array forward, hide the previous bird, and activate the next bird; the Crow, for example, was replaced by a Song Sparrow. This functionality was cool, but it didn’t really bring anything special to the table. I needed the app to do more.
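The cycling logic itself was dead simple. Here’s a rough, hypothetical sketch of that kind of script (the names are mine, not the exact code from the prototype); the public method would be wired to the cube’s Event Trigger select event.

```csharp
using UnityEngine;

// Hypothetical sketch: shows one bird at a time from an array and cycles forward
// when NextBird() is called (wired to the cube's Event Trigger select event).
public class BirdCycler : MonoBehaviour
{
    [SerializeField] private GameObject[] birds;  // Crow, Song Sparrow, etc., assigned in the Inspector
    private int index;

    private void Start()
    {
        // Show only the first bird at launch.
        for (int i = 0; i < birds.Length; i++)
        {
            birds[i].SetActive(i == 0);
        }
    }

    // Hide the current bird and activate the next one, wrapping around at the end.
    public void NextBird()
    {
        birds[index].SetActive(false);
        index = (index + 1) % birds.Length;
        birds[index].SetActive(true);
    }
}
```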
Carolyn and I brainstormed. What else was out there for apps in this realm of bird identification? There were apps and websites that taught you what birds looked like and even played their calls. We became discouraged. Was what we were creating really a new idea? It didn’t look like it. We seriously considered changing the app to feature fish or other animals. Maybe we could create a fish identification app for divers? Then I remembered an experience from BioBlitz, an annual event put on by the Rhode Island Natural History Survey.
During BioBlitz, a bunch of scientists and citizen-scientists go out and identify as many animals and plants as they can in an area within 24 hours. It’s an amazing event that lets you learn about nature in our state and get to know a physical location in an extremely intimate way. I had done three BioBlitzes with Carolyn, and I remembered being taught bird calls by birders one time at Snake Den State Park. They had this great way of explaining bird calls: they used mnemonics to describe the calls and help you remember them. For instance, an American Goldfinch’s call can sound like “PO – TATO – CHIP.”
What if our app split up the bird calls and presented each piece alongside its mnemonic, really focusing on teaching the mnemonic together with the birdcall? It seemed like it might just work. I got to work splitting up the calls and building functionality that let me slot each call piece in with its mnemonic piece and adjust the timing of it all. Carolyn began producing mnemonics for the birds and worked with me to determine where the calls should be split up. We also decided on a name for the app: “Birdsong Explorer.” We were closing in on the main competition start on September 26.
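Under the hood, the idea boiled down to pairing each slice of audio with a mnemonic fragment and a timing gap, then playing them back in sequence. A rough, hypothetical sketch of that structure (again, names and fields are illustrative, not the prototype’s actual code):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: plays a bird call that has been split into pieces, showing the
// matching mnemonic fragment ("PO", "TATO", "CHIP") while each piece plays.
public class MnemonicCallPlayer : MonoBehaviour
{
    [System.Serializable]
    public class CallPiece
    {
        public AudioClip clip;    // one slice of the full bird call
        public string mnemonic;   // the word shown while this slice plays
        public float pauseAfter;  // adjustable gap before the next slice
    }

    [SerializeField] private CallPiece[] pieces;  // slices in order, assigned in the Inspector
    [SerializeField] private AudioSource source;  // AudioSource on the bird
    [SerializeField] private Text mnemonicLabel;  // world-space label near the bird

    // Wired to the bird's Event Trigger select event.
    public void PlayCall()
    {
        StopAllCoroutines();
        StartCoroutine(PlayPieces());
    }

    private IEnumerator PlayPieces()
    {
        foreach (CallPiece piece in pieces)
        {
            mnemonicLabel.text = piece.mnemonic;
            source.PlayOneShot(piece.clip);
            yield return new WaitForSeconds(piece.clip.length + piece.pauseAfter);
        }
        mnemonicLabel.text = "";
    }
}
```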

Even though we weren’t using VR for our app, I still wanted to make the experience immersive using Magic Window. Personally, I find this tech to be really cool and accessible. I wanted the learning environment to be memorable. I swapped out the platform that the birds were sitting on with a free tree stump asset, put a ground texture down on the plane, brought in some free, low-poly trees to put around the player, and set up panels with free forest pictures from Pixabay to create an immersive forest feeling.

It’s amazing what you can do with a few well-placed pictures. I used the same image around the sides, which were 2D, but had a number of 3D trees between the player and the panels, which created a great sense of depth. The canopy above was nice and thick, but I also pulled a few pictures from Pixabay for sky panels above the player. In a more advanced 3D game, these assets might actually be 3D models, be lit, and cast shadows, which requires a lot of computing resources. On mobile, you have to get creative. But it really isn’t that hard to create a convincing environment when the character is stationary and meant to focus on a singular object, in this case the bird in front of them. By the morning of the first competition day, we had a prototype that you can see in the video directly below:
Carolyn was out in the field tagging terrapins during the Immerse-A-Thon launch briefing on September 26. I paid special attention to the judging rubric portion of the presentation:
- Design Ideation – 25%
- Problem/Solution Fit – 20%
- Originality – 20%
- Scalability – 20%
- Implementation – 15%
Our working prototype would account for only 15% of our score! Furthermore, judges would not be using the app themselves. That meant my Unity development skills and the work we had put into the prototype wouldn’t make this a slam dunk. It wouldn’t be a showdown between developers. If the other teams came up with great ideas and presented them well, they would absolutely have a chance to win this thing.
As an infantryman in the Marines for several years and an adult educator in environments where conditions can change very quickly, I have learned how to pivot decisively and forcefully. Adapt and overcome or die. It might sound harsh, but that was the school I was brought up in. I made a snap evaluation of what resources would be most useful to me in that moment and what the best way forward was. I was done working on the prototype. We had nailed the 15% implementation portion; now we needed to come up with the other 85%.
As part of the Immerse-A-Thon, 17 different mentors from a wide range of disciplines were available to meet with groups for 30-minute sessions. I needed good ideas and guidance on how our app could best meet the criteria of the judging rubric, and it was clear that the mentors would be an extraordinary asset. After the mentors introduced themselves during the morning session, I asked if I could make more than one mentorship appointment. I was given the green light and proceeded to make appointments with four mentors. I probably could have done one or two more, but I also didn’t want to break my brain ahead of the pitch competition.
The first mentor I met with was Jiangong Chen, the PhD student who had first demonstrated how to get Google Cardboard set up in Unity. I had a few questions I wanted to bounce off of him regarding the technical aspects of the Birdsong Explorer prototype. I wasn’t going to make any changes to the prototype at that point, but Immerse-A-Thon was proving to be an amazing learning experience for me in terms of app development, and I wanted to follow through on some lingering questions. I got some answers during the session, but most importantly, when I asked Jiangong about the gyroscope, we began talking about the sensors that come with most modern cell phones: the camera, microphone, accelerometer, GPS, and so on. This got me thinking about other functionality that could be built by utilizing those sensors.
Carolyn came back from the field and joined me for the second mentorship appointment that afternoon. Our mentor was Siu-Li Khoe, the Executive Director of Rhode Island Virtual Reality (RIVR), who had also been involved with the development of several international start-ups. I knew that Siu-Li was someone I wanted to introduce myself to in terms of getting involved with XR in New England. Once Carolyn explained what we were trying to accomplish, Siu-Li mentioned a plant identification app that she’d been loving lately called PictureThis: you take a picture of a plant, and the app identifies it. Interesting. Siu-Li began helping us think about who would use our app. We imagined it could be for people who were interested in birds but needed something more accessible than a field guide.
Our next mentoring meeting was with Chris Jarvis, a bestselling author and entrepreneurial strategist. When Chris had introduced himself that morning, he said he would disrupt the thinking of any team that met with him. I believed him, and I was on the hunt for good ideas, so I was eager to hear his take on Birdsong Explorer. When we told Chris that our target was people who were kind of interested in birds, he laughed. Chris started asking questions and began challenging us to drill deeper. By the end of the conversation we ended up with an app concept that combined a collection experience like Pokémon Go with an educational experience like the Discovery Channel. He disrupted our thinking alright! We had a much more compelling pathway as a result.
Finally, I met with Indrani Mandal, who had done the Unity setup demonstration during the orientation and who also had expertise in education and creating educational apps. As an educator, I knew that the app needed to be tightened up to maximize learning. I had actually experimented with implementing a quiz in our prototype the previous week (two nights’ worth of work!), but I had ultimately shelved that element because I had encountered problems and didn’t know how to make it work seamlessly with the main portion of the app. Indrani suggested that I take a cue from Khan Academy Kids, which has extremely tight loops of short lessons and short formative assessment quizzes. I wouldn’t be able to implement this into our prototype, but I knew exactly how I could present it.

After the mentoring sessions, Carolyn and I ate dinner and got straight to work on our presentation. We knew the presentation would make or break us. We looked at the rubric and organized our PowerPoint slides to address its headings. This is where Carolyn really stepped up for the team. She frequently made presentations for her work at the Graduate Writing Center at URI. She got a great-looking presentation template up and running quickly and hashed out a format. I brought in bits of information about the app and created any mock-up images we needed for the presentation in Photoshop. By the time we reached midnight on Saturday, we agreed that the presentation was coming along. It still needed to be tightened up a bit, but we also had a 20-minute pitch coaching session scheduled for 9 am on Sunday morning. We needed to get some sleep!

Our pitch coaching session was with Nancy Forster-Holt, an Assistant Professor of Entrepreneurship at URI; Josh Daly, the director of the southern region for the RI Small Business Development Center at URI; and Chris Jarvis, the entrepreneurial strategist who had been one of our mentors on Saturday. During the session, we would get 5 minutes to deliver our pitch, and the other 15 minutes would be dedicated to feedback and adjustments. Carolyn and I had begun practicing our pitch the previous night to identify areas that needed work and tighten it down to 5 minutes. We ran through it a few times before entering the call with the pitch coaches. Our presentation wasn’t perfect, but it was good enough to receive feedback on.
We delivered the pitch, and the feedback was pretty good. Chris said he was impressed by how much we had pivoted from our previous idea. When we pitched to the judges later that afternoon, we would have 5 minutes to deliver the pitch and 3 minutes for a Q&A session. Now that our app concept included features that required the app to identify photos and sounds in order to verify the collection of those items, I was worried that I didn’t know enough about the tech and how to make that happen. The coaches told us to point to apps that were already using similar tech, like PictureThis for photos of plants and Shazam for songs. We didn’t need to explain how the sausage was made; we just needed to show that it was possible and that other apps were doing it. Just like using the examples of Pokémon Go and the Discovery Channel, it was important to point to successful and proven use cases.
Carolyn and I left the session feeling pretty good about the presentation. I wanted to look up the tech behind PictureThis and Shazam to understand it better and use terminology related to it. I also wanted to create a mock-up image for the formative assessment portion. I can talk about formative assessments all day; that’s really in my court, so I wanted relevant imagery that I could refer to. After some searching online, I discovered that PictureThis and Shazam use machine learning algorithms to produce their results: computers look for “fingerprints” in pictures and audio files and “learn” to identify different objects or songs based on a set of criteria. The formative assessment mock-up was easy. I thought of the tight loop that Indrani had mentioned and created an image of what I thought that would look like.

We got the slides we wanted in, trimmed what we didn’t need out, and practiced our pitch a few more times. We only had a few more hours before the pitch competition began. Carolyn and I decided to go for a walk to help clear our heads a bit. During the walk we tried to anticipate questions that the judges would ask us. One of those questions was, how would this thing actually make money? Carolyn and I had never been great about asking or answering that question, and it was one of the top things that the mentors kept bringing up.
They’d ask questions like: Who is your customer? How much will this cost?
We decided that the base version of Birdsong Explorer would be free. I then took a page from one of my favorite free-to-play games, Rocket League, and its paid tier, Rocket Pass. In Rocket League, the base game is free, and you pay $10 to get access to 3 months of special challenges and cosmetic items for your car. As a player, I have no problem shelling out that $10, because it makes the game way more fun. For Birdsong Explorer, we’d provide the base app for free on the app store, and users could then pay $5 for 3 months of special content including extra birds and challenges.
We got back from our walk just in time to practice our pitch a few times. On our last practice run, we finished with 10 seconds to spare. We were as ready as we were ever going to be. This was it. We logged into the livestream of the event. The host introduced the judges, all of whom were extremely accomplished! Their expertise ranged from running multi-million-dollar companies to video game development, project development for NASA, and other engineering endeavors. After the judges were introduced, Carolyn and I looked at each other and agreed we were both a bit nervous. That was a good sign though. We were excited, had worked very hard, and needed to prove to these knowledgeable thought-leaders that our app was the best at this event. We waited eagerly for the host to call us up to present. Before long, our time finally came. See the video below for our pitch!
We nailed the presentation, and the judges seemed to enjoy what we delivered. During the Q&A session, we had answers readily available for all of their questions. The other teams competing in the Immerse-A-Thon had great ideas, but I felt our presentation was a bit tighter than the competition’s. We also had a working prototype, which, even though it counted for only 15%, still counted for something and seemed to engage the judges more. Hopefully, the judges felt the same way.
After the teams finished presenting, the keynote speaker for the event, Jonathan Flesher, an Emmy-award-winning producer of VR animation with a ton of other accomplishments, was interviewed by…Chris Jarvis. Chris was clearly a force to be reckoned with. Jonathan shared a lot of insight about where he thought XR technology was going in the near to mid-term future, along with tips for entrepreneurs getting into the space. After he finished speaking, it was time for the winning team of Immerse-A-Thon 2020 to be announced. See the video below for the big reveal!
We won!
Carolyn and I each won $500, and I gained a huge performance accomplishment in app development. It was an amazing experience that, if you’ve read this far (thank you, by the way), was complex, with many twists and turns. As I’m writing this, it’s one week after the event. Our inboxes have been flooded with congratulations and with amazing staff from URI looking to support us in our endeavors. Chris Jarvis has agreed to mentor us; we met with him a few days ago, and we have other meetings coming up. I don’t know what is on the horizon, but I am so glad that I started looking into VR this past summer and reached out to Dr. Bin Li!