My perception of Veterans Day has shifted over the years. Before joining the Marines in 2005, to me, Veterans Day was a holiday for heroes, a day for honoring gods among humans—those that had sacrificed their lives and those that were selflessly serving their country. I wanted to join those ranks, and eventually I did. The USMC was eager to take me on as an infantryman in 2005 as the war in Iraq was intensifying.
When I came home from boot camp on leave in the spring of 2006, I was settling into my new identity as a Marine. Returning to my family home in Wisconsin felt strange. My surroundings were familiar, but I was different.
There was a Memorial Day ceremony coming up at the local cemetery, and my mother asked if I would put on my new dress uniform, my Alphas, and participate in the ceremony.
I remember feeling this extraordinary weight when she asked me to go. I told my mother that I didn’t want to do it. That I couldn’t do it. Who was I to stand next to the veterans at that ceremony? Some of them had fought in the Gulf War, the Vietnam War, the Korean War, and even WWII. I had earned the title of Marine, but beyond going through boot camp, I hadn’t done a damn thing. I didn’t rate. I was what we call in the military a “boot,” and I couldn’t possibly stand there next to these giants of the human spirit and what they represented. I didn’t quite have words for what that was, but I knew that it was too great for me to participate in.
My mother insisted that I go until finally, it got to a breaking point. I told her that I couldn’t do it. I was so upset that I actually shed tears. She relented, and I remember asking myself, “Why do I feel so conflicted on this topic?” Looking back, I can say it was because I was in the process of ‘crossing over’ into my military identity. I felt so conflicted because I was now supposed to be one of those unfathomable gods representing virtues that I could not previously comprehend as a civilian. At least that’s what my culture had taught me.
I went on a few tours to Iraq in 2007 and 2008. My active duty enlistment ended in 2009. While I was in college, I did some reserve time and got out in 2013. I was good during my active duty time. While I was in college, my duties as a sergeant in the reserves kept me from rattling apart. By the time 2014 hit, I was completely adrift.
My latest romantic relationship had collapsed fantastically. I was working 3rd shift stocking shelves in a grocery store. I was living in my parents’ basement. Drinking. Doing drugs. Not sleeping. I didn’t know it at the time, but I was desperately trying to process my feelings while writing “Inner Harsh.” Besides the writing, things were not going well for me.
I remember that year, I decided to get my free meal at Applebee’s for Veterans Day. I was hesitant. Who was I to get this free meal worth $15? I was trash. I didn’t want to sit around and tell war stories. I didn’t want to talk about what I was up to after the military. In my mind, nobody understood what I was going through. Hell, I didn’t even know what I was going through. All I knew was that I felt like crap, and when somebody told me, “Thank you for your service,” I didn’t know how to feel or talk about it, because I had never developed the vocabulary. My service and my pain were both unspeakable.
I just wanted to show my VA card at Applebee’s, get my damn steak, eat, and continue on with my miserable existence.
I didn’t go in for my free meal for the next 4 years or so. I didn’t know what to do with my identity as a veteran. I pushed it down. Kept it out of sight. It wasn’t out of shame or anger; I just didn’t know what to do with it. I left the country and lived in China for a few years. I pretended like my identity as a Marine and a veteran didn’t matter. Of course, when I look back, I was hilariously wrong. I wrote a few plays for an expat community theater in Suzhou, China, and the picture I submitted for the profile board was the one below, of me going off the rails in 2013 – 2014 in my parents’ basement, holding an AK-47.
It was the only picture of strength that I had that wasn’t from my time in the military. I hadn’t entirely succeeded in suppressing my veteran identity. I wanted to show it, but I didn’t want to show it. I was determined to not “just” be a veteran. I couldn’t post a picture of me in uniform, because only a jackass that couldn’t move on with his life would do such a thing. Or that’s what I told myself.
Of course, this had all been bullshit. I was suppressing a major component of my identity. I wouldn’t be forced to confront that identity again until I re-entered the United States, settled down in Rhode Island with my girlfriend, and eventually started grad school. The vet coordinators at URI at the time, Jeff Johnson and Rachael Garcia, invited me to enroll in a class exclusively for transitioning vets. I needed the credits to take full advantage of my remaining GI Bill. That was the practical reason. More importantly, the class forced me to sit in a room with several other veterans, something that I had never done before. It was an uncomfortable yet positive experience that forced me to think about being a veteran outside of a passing “Thank you for your service” on Veterans Day.
That following summer of 2019, I met Dr. Mark Santow, director of the Providence Clemente Veterans’ Initiative, while I was tabling for a Frequency Writers event in Providence. I had been running an expressive writing workshop, Endless Beautiful, for a few years at that point and had done some work with Frequency. During our first conversation, Mark said that he was working with veterans. Because of my contact with other veterans at URI earlier that year, and the resulting exploration of my veteran identity, I told him that I was a vet. I spent the next few semesters in PCVI exploring my veteran identity even more.
I became more comfortable talking about and claiming my identity as a veteran. I eventually decided to create a writing program for vets as part of PCVI. I designed and taught the PCVI Summer Writing Seminar in the summer of 2020. The class went well, despite COVID-19, and I look forward to running the next version in the summer of 2021.
So what are my feelings about Veterans Day now that I’ve been a veteran for several years?
We must push against and destroy abstract totems and untouchable sacred cows that our culture has built around military service. Military veterans in this country, however noble our sacrifice, whether we are dead in the ground or living our lives as Americans, were and are people with hopes and dreams, troubles, and successes—just like any other American.
Yes, honor us. Military service is an incredibly important act. But don’t limit our range as human beings. Don’t demand that we continue to tell the same stories and trap us. It’s okay to thank a servicemember for their service, but ask us about, and genuinely care about, how we are doing today, as people, not caricatures of people. Acknowledge us as full human beings with kids, hobbies, and anxieties.
Listen and learn from us.
I’m not going to lie, opening these lines of communication might be awkward as hell, but we have some work to do in this country. In a study of U.S. military veterans’ mental health from 2011–2012, researchers found that serving in a war zone was associated with 43% decreased odds of suicidal ideation, compared to not serving in a war zone (Blosnich et al., 2016). This finding surprised the hell out of me. It flies in the face of the dominant narrative about our veterans. Our culture assumes that if someone has been to a war zone, they must be terribly damaged (“I could never imagine what you’ve gone through”), or, if they were not terribly damaged, then they are somehow not veteran enough. They don’t rate. Just like I didn’t rate after getting out of boot camp.
I felt that inadequacy again after getting out of active duty. Yes, I was a USMC infantryman, but I never got into any intense firefights during my deployments. I had friends that were torn up by machine guns, had limbs blown off, messed up by IEDs—I was in crazy situations that happen when you’re driving around Iraq during a war, but no gunfights, which meant no coveted Combat Action Ribbon (CAR) for an infantryman. When I finished my deployments and got out of active duty in 2009, I felt that I hadn’t done my job.
And it felt like shit.
But how could I possibly feel that way if I hadn’t lost a limb or had buddies die in my arms? You know, the unspeakable horrors of war that I, and every other American, had seen in countless war movies, but never spoke about openly. I didn’t have an open conversation about my service for 10 years after getting out, because everyone was afraid of “reopening wounds.” We’ve built a system that treats trauma as currency with our veterans and gives them few options to move out from under that trauma.
By just throwing a flag up, or saying “Praise the Troops” and running a parade, and not digging any deeper to understand the people who have carried out and are carrying out military service, our country is saying, “We honor you, we are in unspeakable awe of your military service, and for that, you must carry this unspeakable burden. If you haven’t experienced trauma, then you better find some, because we need you to carry that and let it torture your soul, otherwise this whole thing doesn’t work.”
How do we break this down as a society? We develop a space for open discourse. We demystify military service and our veterans. We encourage veterans to tell stories, not just war stories, but stories as human beings, as Americans, as neighbors, sisters, mothers, brothers. We give them a path forward and don’t shackle them to their past or to a narrative that is dictated by the dominant culture—give them a voice, authentic representation, and not just on Veterans Day, but every day.
That is how we can honor military service.
Follow this link to donate to the Providence Clemente Veterans’ Initiative.
Blosnich, John R., PhD, MPH, Brenner, Lisa A., PhD, & Bossarte, Robert M., PhD. (2016). Population mental health among US military veterans: Results of the Veterans Health Module of the Behavioral Risk Factor Surveillance System, 2011-2012. Annals of Epidemiology, 26(8), 592-596.
I recently competed in an event at the University of Rhode Island called Immerse-A-Thon. The event was a competition centered around designing innovative VR/AR applications. This article is a detailed telling of how I became involved with the event, what the stages of my app prototype development looked like, behind-the-scenes takeaways from mentor meetings that happened during the event, challenges encountered, solutions that I came up with, and a narrative of the competition from my perspective. I’ve included images and videos captured along the way. I’m learning a ton by reflecting on the experience through this writing. If anything, this should be an interesting look at what led up to the event, what happened during the event, and what the result was. Let us begin!
At the end of my Spring ’20 semester at URI in the Master’s program in Adult Education, I became exponentially interested in the possibilities of XR (AR/VR) in the educational space. Looking back, there were a few elements that seeded my interest. COVID-19 had completely disrupted the way I was teaching classes for the community-based Adult Ed program that I work for. I was now teaching ESL exclusively online, and I was beginning to understand the strengths of online teaching and the opportunities that it presented. Additionally, through my studies, I was introduced to a corporate trainer from the local company AAA, and the tools that the trainer used to create their educational materials. One of the tools shown to us was Adobe Captivate, which basically allows you to create dialogue scenarios and quizzes in a nice-looking package. Finally, as a gamer, I had looked into VR again, and I was blown away by the advancements that had taken place over the past few years.
Fast forward a few months.
I messed around with Adobe Captivate for a bit, and it felt limited. I had some experience plunking around with development engines like Unity 3D and Unreal Engine, which basically allow you to create anything…as long as you know how to program it to do what you want. As cool as Captivate was, I wanted to do more than create dialogue branches and quizzes. As I became more interested in XR and experimented with an Oculus Rift-S, it became apparent to me that VR was going to represent an absolute sea change in education once the tech became more easily available and developers got better at making educational content. I was determined to be a part of that sea change. I started looking up people at URI who were involved with VR.
The first person that I found was Dr. Bin Li, an Assistant Professor in the Department of Electrical, Computer and Biomedical Engineering at URI. Along with doing a bunch of really cool stuff around XR over wireless networks, Dr. Li was the co-organizer of the Immerse-A-Thon. There were mentors, a pitch element, and a $1000 prize for the winning team. During our meeting, Dr. Li explained the power and potential of AR (Augmented Reality) to me. I registered for Immerse-A-Thon right away, and per Dr. Li’s guidance, I began learning more about what AR was and reengaged with my Unity 3D development skills.
Through a combination of Udemy classes and YouTube videos (I’ve linked a few of the most helpful), I was able to at least navigate the Unity interface and write extremely basic code to get things interacting with each other. As I mentioned previously, I had dabbled with game development over the past several years in both Unity and Unreal. When I had tried to learn the tools in the past, I would make something based on a tutorial, but I just never seemed to get over the “hump” of making something entirely mine, with my own thought processes. I had a decent understanding of navigating the programs, but things quickly fell apart when it came to the programming. At some point, other priorities would come up in my life and I’d tell myself, “This is just a game anyway. It doesn’t really matter that much.” And I’d fall off game development for 7 months to a year, only to start off at maybe “Square 1.5” the next time.
Developing apps for VR felt different for some reason. There was something magical about creating an environment and objects in VR and then putting my Oculus Rift-S on and being able to walk around that world and interact with objects. Plus, this time around, I was determined to attach my development interests to my professional trajectory as an educator. If I could put those two things together, I knew I’d open up a world of possibilities. I’m a builder. If I could harness the power of these tools, I could build things that would truly take advantage of my capabilities and make a large impact.
After a few months of prep, the first orientation day for Immerse-A-Thon happened on September 13, via Zoom. Organizers presented on what to expect during the event. Computer Science lecturer Indrani Mandal and PhD student Jiangong Chen gave an introductory demonstration on how to get set up with Unity and begin working with the Google Cardboard SDK (Software Development Kit). My development in Unity for the previous few months had focused on using a PC-based VR solution such as the Oculus Rift, which offers 6DoF (Degrees of Freedom), meaning you can walk around, you have motion controllers and your hands to interact with things, etc. Google Cardboard is 3DoF, meaning you can look around 360 degrees, but you don’t have motion controllers. You interact with objects by looking at them and hitting a button on the cardboard or your phone screen. This was a completely different way to think about VR.
With that presentation, all of the VR interaction and movement development techniques that I had been experimenting with and learning about just went out the window. It wasn’t a requirement for me to create something that used Google Cardboard, but my goal was to create something that students in a community-based ESL program could use. Upon further thought, I realized that organizations hosting the community-based programming would not realistically have access to high-end PC VR gear. Google Cardboard made sense. Almost all of my students had smart phones that could run it. It presented major limitations versus more robust VR systems, but those limitations also got me thinking about using the core tech differently.
The final day of the Immerse-A-Thon was to happen in 2 weeks, September 26. Before the final pitch event, I had to figure out what I was going to do, form a team (I needed at least one other person in my team), and generate something that was interesting and valuable to pitch to a panel of judges.
I began playing around with the demo scene that Indrani and Jiangong had shown us during the orientation day event. It was a basic scene that had a floating cube. If you looked at the cube and selected it, you could transform it into another shape. The demo scene provided everything that I needed in terms of camera movement and an event system for interacting with objects. If you attached an Event Trigger component to an object, Unity could respond to events, such as the player’s gaze landing on that object, with whatever instructions you hooked up. This is where my practice over the previous few months came in handy. I wasn’t great at programming yet, but I had messed around with Event Triggers and basic code. I also knew where to look for components when I needed them. I created a new scene, created a plane to stand on, and dragged in a few free, cutesy farm animals from the Unity Asset Store. My initial concept was to create a tool for teachers to load in objects, assign names and sounds to those objects, and allow students to use the app to help them study vocabulary.
I was able to get something up and running relatively quickly. When you looked at an object, such as a pig, a title would pop up that said “Pig.” When you looked at a cow, it said “Cow.” Basic stuff. I knew how to add sound from previous experimentation, so I planned on recording myself saying the names of the objects. I still didn’t have a team, which was a problem. I put out a call on the Immerse-A-Thon Slack about my project and aims, but didn’t get anything back. It was still relatively early in the competition.
As I learned more about creating interactions with objects in Unity, I had an important epiphany. As much tech as I had at my disposal, including a PC and a PC-VR headset, I didn’t have a 10 dollar Google Cardboard! All participants of Immerse-A-Thon were receiving a cardboard viewer for our phones, but those wouldn’t be mailed out until a week prior to the main event. Although a Google Cardboard Viewer wasn’t the most difficult thing to find on the internet, it was still a highly niche item. You couldn’t just go down to the gas station around the corner and grab yourself a Google Cardboard Viewer.
This is where my instincts as an experienced educator in generally low-resource and dynamic environments kicked in. If I didn’t have this stuff, how could I expect a school or community-based organization to provide Google Cardboard hardware? As cheap as a viewer was, it represented a friction point, and honestly, I doubted that 3DoF VR was nearly as immersive as its 6DoF counterpart. Was it worth jumping through that hardware hoop to create an effective educational app? I seriously doubted it. From my experimentation with VR, holding something up to your face that let light in and didn’t allow you to move around or interact naturally wasn’t going to produce enough immersion to warrant the hassle. That’s when I remembered something called Magic Window.
Magic Window is a Google technology that uses the gyroscope of a cell phone to look around just like in VR, except you don’t put the phone on your face. In VR, you have 2 cameras, one for each eye, to create depth; in Magic Window, you have 1 camera, and the user can also drag a finger on the phone screen to rotate the view. No extra hardware is required. After looking at resources for developers provided by Google, I switched the app to work with the gyroscope and not use the 2-camera setup of Cardboard. I exported the demo to my phone. After testing, it felt like the right move.
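Conceptually, all Magic Window has to do is combine the phone’s gyroscope orientation with an accumulated finger-drag offset and point a single camera in that direction. Here is a rough sketch of that math in plain Python; the function name, parameters, and sensitivity value are mine for illustration, not part of Google’s actual SDK:

```python
def magic_window_rotation(gyro_yaw, gyro_pitch, drag_x, drag_y,
                          sensitivity=0.2):
    """Combine device orientation with a finger-drag offset.

    gyro_yaw, gyro_pitch: device orientation in degrees, from the gyroscope.
    drag_x, drag_y: accumulated finger-drag distance in screen pixels.
    Returns the (yaw, pitch) in degrees to apply to the single scene camera.
    """
    yaw = (gyro_yaw + drag_x * sensitivity) % 360          # wrap around 360
    pitch = gyro_pitch + drag_y * sensitivity
    pitch = max(-90.0, min(90.0, pitch))                   # don't flip over the poles
    return yaw, pitch
```

Because there is only one camera and no stereo rendering, this runs on essentially any phone with a gyroscope, which was the whole appeal for a low-resource classroom.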
But…I still didn’t know what the heck I was going to do about a team. Who was I going to get to join my team? What if my teammates had their heart set on creating a VR or AR application? After all, the event was about VR/AR!
Just as I was deep in my thought processes about team formation, Carolyn came home from her field work that day. For a little over a year, she’d been intensely involved with studying terrapin turtles. I’d gone out to the field with her a few times to attach radio tags to baby turtles the size of Oreo cookies and observed where the little radio-tagged explorers went after they were released. I’m not averse to nature; I enjoy the outdoors. But I’m also somebody who loves working on computers and playing video games. Carolyn has gotten me outdoors much more than what would have otherwise happened over the past 4 years. She’s also taught me a lot about animals and plants and will frequently quiz me on bird or tree types when we encounter them on our hikes.
“Would you like to help me prototype an educational nature app for this Immerse-A-Thon event that I’m involved with?” I asked Carolyn as she put her field equipment down in our kitchen. She laughed. This type of question was a familiar setup for us already. I’d come up with some idea, plot it out in my mind, and then realize I wanted and needed her help to pull it off. Carolyn told me she didn’t know anything about app development, which I assured her I would handle. I told Carolyn that her role would be to provide meaningful and correct scientific information about animals. I could tell she liked this idea.
A little while later, while Carolyn was on the phone with her mom, she pulled her phone away from her ear and said, “Okay, I’ll do this, but on one condition…we need to win.” I smiled and went back into the studio to continue my work.
We now had a team that could compete in the competition. That night, we came up with a good team name: “Tuxedo Cat.” Our cat Bella always features prominently in our projects!
I knew the pig and the cow in our current demo weren’t going to cut it. I needed to find a bigger set of animal models, and hopefully something a bit more interesting. I took a look in the Unity Asset Store for animals and searched through their free assets. There were some dogs and cats, fish and other marine animals, and also a pack called “Living Birds,” which featured several good-looking models of common North American birds complete with animations and even bird songs! Perfect! We had a week to go before the final weekend, and I had to figure out how to put these bird models to good use.
I ditched the cow and the pig from my demo scene and kept the platform directly in front of the camera. I then brought in a crow model and placed it on the platform. I created a script with a few functions that I could call, so that when you looked at the bird, it presented info, and when you selected the bird, it played a bird call. Carolyn jumped into her field biologist role and made a spreadsheet that contained bird information for all of our birds. This was super valuable to me, because she knew exactly where to find this information and how to determine whether it was accurate. As that final week went on, I created an array of birds and an interactable cube which allowed you to cycle through the birds. If you selected the cube, it would advance the array, hide the previous bird, and activate the next bird. For example, the Crow was replaced by a Song Sparrow. This functionality was cool, but it didn’t really bring anything special to the table. I needed the app to do more.
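The cycling logic itself is simple and engine-agnostic. Here is a minimal sketch of the idea in plain Python (the actual version was a C# script in Unity toggling GameObjects; the class and method names below are invented for illustration):

```python
class BirdCycler:
    """Keeps exactly one bird from a list visible at a time."""

    def __init__(self, birds):
        self.birds = birds          # e.g. ["Crow", "Song Sparrow", ...]
        self.index = 0
        # Track visibility: only the first bird starts active.
        self.active = {b: (i == 0) for i, b in enumerate(birds)}

    def select_cube(self):
        """Called when the player selects the cube: hide the current
        bird, activate the next one, and wrap around at the end."""
        self.active[self.birds[self.index]] = False
        self.index = (self.index + 1) % len(self.birds)
        self.active[self.birds[self.index]] = True
        return self.birds[self.index]
```

In Unity, the dictionary toggles would instead be `SetActive(false)`/`SetActive(true)` calls on the bird models, wired to the cube’s Event Trigger.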
Carolyn and I brainstormed. What else was out there for apps in this realm of bird identification? There were apps and websites which taught you what birds looked like and even played their calls. We became discouraged. Was what we were creating really a new idea? It didn’t look like it. We were seriously considering changing the app to feature fish or other animals. Maybe we could create a fish identification app for divers? Then I remembered an experience during an event put on by the Rhode Island Natural History Survey called BioBlitz that happens annually in Rhode Island.
During BioBlitz, a bunch of scientists and citizen-scientists go out and identify as many animals and plants in an area within 24 hours as they can. It’s an amazing event that allows you to learn about nature in our state and learn about a physical location in an extremely intimate way. I had done 3 BioBlitzes with Carolyn, and I remembered being taught bird calls one time at Snake Den State Park by birders. They had this great way of explaining bird calls. They would use mnemonics to describe the bird calls and help you remember them. For instance, an American Goldfinch’s call can sound like “PO – TATO – CHIP.”
What if in our app we split up the bird calls, and presented them with the mnemonic and really focused on teaching the mnemonic with the birdcall? It seemed like it might just work. I got to work, splitting up the calls, and building in functionality which allowed me to slot the call pieces in, with an appropriate mnemonic piece, and adjust the timing of it all. Carolyn began producing mnemonics for the birds and worked with me to determine where the calls should be split up. We also decided on a name for the app, “Birdsong Explorer.” We were closing in on the main competition start on September 26.
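In essence, each bird call became a sequence of audio segments, with each segment paired to a mnemonic piece and a start time, so the app could flash the right syllable as its slice of the call played. A minimal sketch of that data shape in Python (the clip names and timings are invented for illustration; the real app slotted actual audio clips in Unity):

```python
# Each segment pairs a slice of the bird call with its mnemonic piece
# and the moment (in seconds) it should begin.
GOLDFINCH_CALL = [
    {"clip": "goldfinch_1.wav", "mnemonic": "PO",   "start": 0.0},
    {"clip": "goldfinch_2.wav", "mnemonic": "TATO", "start": 0.4},
    {"clip": "goldfinch_3.wav", "mnemonic": "CHIP", "start": 0.9},
]

def build_schedule(segments):
    """Return (start_time, clip, mnemonic) tuples in playback order,
    so the UI can show each mnemonic piece as its call piece plays."""
    return sorted((s["start"], s["clip"], s["mnemonic"]) for s in segments)
```

Keeping the timing data separate from the playback code made it easy to adjust where the call was split without touching any logic, which is exactly the tweaking Carolyn and I were doing that week.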
Even though we weren’t using VR for our app, I still wanted to make the experience immersive using Magic Window. Personally, I find this tech to be really cool and accessible. I wanted the learning environment to be memorable. I swapped out the platform that the birds were sitting on with a free tree stump asset, put a ground texture down on the plane, brought in some free, low-poly trees to put around the player, and set up panels with free forest pictures from Pixabay to create an immersive forest feeling.
It’s amazing what you can do with a few well-placed pictures. I used the same image around the sides, which were 2D, but had a number of 3D trees between the player and the panels. This created a great sense of depth. The canopy above was nice and thick, but I also pulled a few pictures from Pixabay for sky panels above the player. In a more advanced 3D game, these assets might actually be 3D models, be lit, and cast shadows, which requires a lot of computing resources. On mobile, you have to get creative. But it really isn’t that hard to create a convincing environment when the player is stationary and meant to focus on a single object, in this case the bird in front of them. By the morning of the first competition day, we had a prototype that you can see in the video directly below:
Carolyn was out in the field tagging terrapins during the Immerse-A-Thon launch briefing on September 26. I paid special attention to the judging rubric portion of the presentation:
Design Ideation – 25%
Problem/Solution Fit – 20%
Originality – 20%
Scalability – 20%
Implementation – 15%
Our working prototype would account for only 15% of our score! Furthermore, judges would not be using the app themselves. That meant that my Unity development skills and the work that we had put into the prototype didn’t mean this was going to be a slam dunk. It wouldn’t be a showdown between developers. If the other teams came up with great ideas and presented them well, they would absolutely have a chance to win this thing.
As an infantryman in the Marines for several years and an adult educator in environments where conditions can change very quickly, I have learned how to pivot decisively and forcefully. Adapt and overcome or die. It might sound harsh, but that was the school that I was brought up in. I made a snap evaluation of what resources would be most useful to me in that moment, and what the best way forward was. I was done working on the prototype. We had nailed the 15% implementation portion; now we needed to come up with the other 85%.
As part of the Immerse-A-Thon, 17 different mentors from a wide range of disciplines were available to meet with groups for 30 minute sessions. I needed good ideas and guidance regarding how our app was going to best meet the criteria of the judging rubric. It was clear that the mentors would be an extraordinary asset. After the mentors introduced themselves during the morning session, I asked if I could make more than 1 mentorship appointment. I was given the green light and proceeded to make appointments with 4 mentors. I probably could have done one or two more, but I also didn’t want to break my brain ahead of the pitch competition.
The first mentor that I met with was Jiangong Chen. He was the PhD student who had first demonstrated how to get Google Cardboard set up in Unity. I had a few questions I wanted to bounce off of him regarding the technical aspects of the Birdsong Explorer prototype. I wasn’t going to make any changes to the prototype at that point, but Immerse-A-Thon was proving to be an amazing learning experience for me in terms of app development, and I wanted to follow through on some lingering questions. I got some answers during the session, but most importantly, when I asked Jiangong about the gyroscope, we began talking about the sensors that come with most modern cell phones—things like the camera, microphone, accelerometer, GPS, etc. This got me thinking about other functionality that could be produced by utilizing those sensors.
Carolyn came back from the field and joined me for the second mentorship appointment that afternoon. Our mentor was Siu-Li Khoe. She was the Executive Director of Rhode Island Virtual Reality (RIVR). She had also been involved with the development of several international start-ups. I knew that Siu-Li was someone that I wanted to introduce myself to in terms of getting involved with XR in New England. Once Carolyn explained what we were trying to accomplish, Siu-Li mentioned a plant identification app that she’d been loving lately called PictureThis. In this app, you basically take a picture of a plant, and the app will identify the plant. Interesting. Siu-Li began helping us think about who would use our app. We imagined it could be for people that were interested in birds, but needed something more accessible than a field guide.
Our next mentoring meeting was with Chris Jarvis, a bestselling author and entrepreneurial strategist. When Chris had introduced himself that morning, he said he would disrupt the thinking of any team that met with him. I believed him, and I was on the hunt for good ideas. So I was eager to hear what his take on Birdsong Explorer would be. When we told Chris that our target was people that were kind of interested in birds, he laughed. Chris started asking questions and began challenging us to drill deeper. By the end of the conversation, we ended up with an app concept that combined a collection experience like Pokémon Go with an educational experience like the Discovery Channel. He disrupted our thinking, alright! We had a much more compelling pathway as a result.
Finally, I met with Indrani Mandal. She had done the Unity setup demonstration during the orientation. She also had expertise in education and creating educational apps. As an educator, I knew that the app needed to be tightened up to maximize learning. I had actually experimented with implementing a quiz with our prototype the previous week (2 nights worth of work!), but I had ultimately shelved that element because I had encountered problems and didn’t know how to make it work seamlessly with our main portion of the app. Indrani suggested that I take a note from Khan Academy Kids, which has extremely tight loops in terms of short lessons and short formative assessment quizzes. I wouldn’t be able to implement this into our prototype, but I knew exactly how I could present it.
After the mentoring sessions, Carolyn and I ate dinner and got straight to work on our presentation. We knew the presentation would make or break us. We looked at the rubric and organized our PowerPoint slides to address the headings. This is where Carolyn really stepped up for the team. She made presentations frequently for her work at the Graduate Writing Center at URI. She got a great-looking presentation template up and running quickly and hashed out a format. I brought in bits of information about the app and created any mock-up images that we needed for the presentation in Photoshop. By midnight on Saturday, we agreed that the presentation was coming along. It still needed to be tightened up a bit, but we also had a 20-minute pitch coaching session scheduled for 9 am on Sunday morning. We needed to get some sleep!
Our pitch coaching session was with Nancy Forster-Holt, an Assistant Professor of Entrepreneurship at URI; Josh Daly, the director of the southern region for the RI Small Business Development Center at URI; and Chris Jarvis, the entrepreneurial strategist and one of our mentors from Saturday. During the session, we would get 5 minutes to deliver our pitch. The other 15 minutes would be dedicated to feedback and adjustments. Carolyn and I had begun practicing our pitch the previous night to identify areas that needed work and tighten it down to 5 minutes. We ran through it a few times before entering the call with the pitch coaches. Our presentation wasn’t perfect, but it was good enough to receive feedback on.
We delivered the pitch, and the feedback was pretty good. Chris said he was impressed by how much we had pivoted from our previous idea. When we pitched to the judges later that afternoon, we would have 5 minutes to deliver the pitch and 3 minutes for a Q&A session. Now that our app included features that required the app to identify photos and sounds to verify collection of those items, I was worried that I didn’t know enough about the tech and how to make that happen. The coaches told us to point to apps that were already using similar tech, like PictureThis for photos of plants and Shazam for songs. We didn’t need to explain how the sausage was made; we just needed to show that it was possible and that other apps were doing it. Just like using the examples of Pokémon Go and the Discovery Channel, it was important to point to successful and proven use cases.
Carolyn and I left the session feeling pretty good about the presentation. I wanted to look up the tech behind PictureThis and Shazam to understand it better and use terminology related to it. I also wanted to create a mock-up image for the formative assessment portion. I can talk about formative assessments all day; that’s really in my court, so I wanted relevant imagery that I could refer to. After some searching online, I discovered that PictureThis and Shazam use machine learning algorithms to produce their results. Computers look for “fingerprints” in pictures and audio files and “learn” to identify different objects or songs based on a set of criteria. The formative assessment mock-up was easy. I thought of the tight loop that Indrani had mentioned and created an image of what I thought that would look like.
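To get a feel for the audio “fingerprint” idea, here is a toy sketch I put together (my own illustration with made-up frame sizes and a naive DFT, not how Shazam or PictureThis actually implement it): take the loudest frequency bin in each short frame of audio, then hash pairs of neighboring peaks so the fingerprint encodes how the dominant frequency moves over time. Two recordings of the same sound share hashes; different sounds don’t.

```python
import cmath, math, hashlib

def dft_magnitudes(frame):
    # Naive DFT magnitudes; fine for short demo frames.
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

def fingerprint(samples, frame_size=64):
    """Return a set of hashes built from the loudest frequency bin
    in consecutive frames -- a toy version of 'landmark' hashing."""
    peaks = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        mags = dft_magnitudes(samples[start:start + frame_size])
        peaks.append(max(range(1, len(mags)), key=lambda k: mags[k]))
    # Hash pairs of neighboring peaks so the print captures movement
    # of the dominant frequency over time, not just its value.
    return {hashlib.md5(f"{a}:{b}".encode()).hexdigest()[:8]
            for a, b in zip(peaks, peaks[1:])}

def tone(freq_bin, frames=8, frame_size=64):
    # Synthesize a pure tone whose energy lands in one DFT bin.
    n = frames * frame_size
    return [math.sin(2 * math.pi * freq_bin * t / frame_size)
            for t in range(n)]

song = tone(5)
query = tone(5)    # the same "song" recorded again
other = tone(11)   # a different "song"

fp_song, fp_query, fp_other = map(fingerprint, (song, query, other))
print(fp_song == fp_query)        # identical signals match
print(len(fp_song & fp_other))    # a different tone shares no hashes
```

Real systems are far more robust (spectrogram peak “constellations,” noise tolerance, time-offset voting), but the core trick of matching hashed landmark pairs is the same.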
We got the slides we wanted in, trimmed out what we didn’t need, and practiced our pitch a few more times. We only had a few more hours before the pitch competition began. Carolyn and I decided to go for a walk to help clear our heads a bit. During the walk, we tried to anticipate questions that the judges would ask us. One of those questions was: how would this thing actually make money? Carolyn and I had never been great about asking or answering that question, and it was one of the top things that the mentors kept bringing up.
They’d ask questions like: Who is your customer? How much will this cost?
We decided that the base version of Birdsong Explorer would be free. I then took a page from one of my favorite free-to-play games, Rocket League, and its paid tier, Rocket Pass. In Rocket League, the base game is free. You pay $10 to get access to 3 months of special challenges and cosmetic items for your car. As a player, I have no problem shelling out that $10, because it makes the game way more fun. For Birdsong Explorer, we’d provide the base app for free on the app store. Users could then pay $5 for 3 months of special content, including extra birds and challenges.
We got back from our walk just in time to practice our pitch a few times. On our last practice run, we finished with 10 seconds to spare. We were as ready as we were ever going to be. This was it. We logged into the livestream of the event. The host introduced the judges, all of whom were extremely accomplished! Their expertise spanned running multi-million-dollar companies, video game development, project development for NASA, and other engineering endeavors. After the judges were introduced, Carolyn and I looked at each other and agreed we were both a bit nervous. That was a good sign though. We were excited, had worked very hard, and needed to prove to these knowledgeable thought-leaders that our app was the best at this event. We waited eagerly for the host to call us up to present. Before long, our time finally came. See the video below for our pitch!
We nailed the presentation, and the judges seemed to enjoy what we delivered. During the Q&A session, we had answers readily available for all of their questions. The other teams competing in the Immerse-A-Thon had great ideas, but I felt our presentation was a bit tighter than the competition’s. We also had a working prototype, which, even though it only counted for 15%, still counted for something and seemed to engage the judges more. Hopefully, the judges felt the same way.
After the teams finished presenting, the keynote speaker for the event, Jonathan Flesher, an Emmy-award-winning producer of VR animation with a ton of other accomplishments, was interviewed by…Chris Jarvis. Chris was clearly a force to be reckoned with. Jonathan Flesher shared a lot of insight about where he thought XR technology was going in the near- to mid-term future, along with tips for entrepreneurs getting into the space. After he finished speaking, it was time for the winning team of Immerse-A-Thon 2020 to be announced. See the video below for the big reveal!
Carolyn and I each won $500. I gained a huge performance accomplishment regarding app development. It was an amazing experience that, if you’ve read this far (thank you, by the way), was complex with many twists and turns. As I’m writing this, it’s one week after the event. Our inboxes have been flooded with congratulations and messages from amazing staff at URI looking to support us in our endeavors. Chris Jarvis has agreed to mentor us. We met with him a few days ago, and we have other meetings coming up. I don’t know what is on the horizon, but I am so glad that I started looking into VR this past summer and reached out to Dr. Bin Li!
This past Wednesday was the final meeting of the PCVI Summer Writing Seminar ’20 class. The dust is settling, and I’ve had the opportunity to reflect on what happened in terms of instruction. The good news is, I can say the class was a success!
We ended up with 5 excellent, experience-based short stories. As far as the instructional content goes, I’d rate it 7.5 out of 10. I want to trim some of the readings and focus more intensely on the ones that resonated with students. I firmly believe we wouldn’t have ended up with the stories that were written without examining the included topics in the intense way we covered them (intense in the sense that this is a 5-week community-based writing program for non-writers).
As expected, this course surprised more than a few of its participants in terms of workload. Almost all of the 15 registered students knocked out a good “I Am a Camera” story. I was pleased with this. When we switched gears in week 2 and started working on the experience-based short story, that’s when 10 of the 15 dropped off. If we hadn’t ended up with the stories that we did at the end of the course, I would say I need to dial the workload back a bit next time, but we ended up with quality output at the finish line.
I need to make it much clearer during the recruiting phase what will be expected early on. I want to interview students before I bring them on next time. This isn’t a knock on the students who dropped after those first few weeks. I’ve come to realize there’s a certain type of programming and expectation for community-based writing programs, especially in the veteran community. Many of these writing programs are focused on the therapeutic benefits of writing and catharsis. Those programs are expressive in nature. The PCVI Summer Writing Seminar is specifically focused on creative writing skills. I had more than a few students enter the class equating creative writing with mental health, and that’s why they were there. Mental health professionals were referring them to the program for mental health gains.
I’m not against programming that is focused on expressive writing and catharsis. I mean, that’s what Endless Beautiful has evolved into in many ways! We hosted a few May is Mental Health Month events in 2020, and I’ve witnessed extraordinary moments of genuine healing and sharing during EB workshops. That being said, I would say that not pursuing skill-focused creative writing programming for veterans is a missed opportunity on many fronts.
The first reason skill-focused creative writing instruction for vets is important has to do with performance accomplishments and self-efficacy.
Here’s Cambridge Dictionary’s definition of self-efficacy:
Perceived self-efficacy refers to people’s beliefs about their capabilities to exercise control over their own activities.
The psychologist Albert Bandura presented the theory of self-efficacy in the late ’70s, and it has been researched and adapted considerably since. Self-efficacy is not only a critical concept in psychology; it has major ramifications in the world of education. Self-efficacy comes down to an individual’s confidence in performing a specific task. There are many factors that play into this, but a key takeaway here is that an individual’s confidence in performing a specific task might determine whether or not they are even capable of performing the task at all.
The experience of successfully performing that task, or failing to perform it, will inform self-efficacy regarding other tasks. It generates a strong feedback loop that can be positive or negative. For example, if you’ve never had success swimming or performing other sports, and others around you are having a difficult time swimming and telling you not to waste your time, it will be extremely difficult for you to learn how to swim. On the other hand, if someone teaches you how to hold your breath under water first and you master that, then how to doggy paddle, and you master that, and then how to do a breaststroke, and others around you are encouraging you and doing breaststrokes, your chances of learning how to swim increase greatly.
I will not dive deep(er) here, but students who are taught to write stories in a comprehensive way and finish those stories will have achieved a major performance accomplishment. Expressive writing exercises don’t necessarily have that “container,” and are oftentimes attached to one’s mental health state. You’re never finished, so you do not get the same self-efficacy boosts as with specific skill-based instruction. Skill-based instruction oriented toward performance accomplishments does not exclude emotional growth. In fact, emotional states are a huge part of the process. I would argue there is potential for much longer-lasting gains for veterans when you do not stop at perceived gains in emotional state and instead aim for a very specific goal and related performance accomplishment, i.e., finishing an experience-based story after several drafts.
Another important aspect is how skill-based instruction generates more meaningful opportunities to reach the public. Well-written stories by a multitude of veterans can go much further in bridging the veteran/civilian divide. Because of the Summer Writing Seminar, I now have a mechanism to train vets to write these stories, and we can share them with the public via our upcoming website, PVDVETS.org. A purely expressive piece of writing might be extremely meaningful to its author but not have the same impact on the public. The students who finished their stories in our class have reported to me that they are motivated to showcase their work to the public and are excited about the opportunity to create understanding between veterans and civilians.
I’ll finish this post with some words that I shared with students in this year’s Summer Writing Seminar after they read their stories on Wednesday. I’m extremely proud of what has taken place already and we’re just getting started!
As I sat down to write this post, I pulled up YouTube and was presented with a curious thumbnail of a video titled “Starship SN5 150m Hop.” The video was under a week old and already had nearly 5 million views. It had been posted on the SpaceX YouTube channel. It’s a minute-long video of the Starship SN5—which looks more like a shiny, silver grain silo than a starship to my non-rocket-scientist eyes—launching, unhurriedly ascending maybe a few hundred feet while drifting to the side in a composed manner, and then slowly descending onto a landing pad 150m away. I might laugh at my Neanderthalian interpretation of this propulsion video when I read this blog post from a starship in twenty years, but hot-damn, in 2020…oohgaabooga. Watch the video for yourself:
In other, entirely unrelated news, Carolyn and I went for a dive at one of our favorite local dive spots, Fort Wetherill in Jamestown, RI, yesterday. The weather was magnificent. The slim, sandy beach was filled with sunbathing and picnicking families in canvas chairs. Skinny teens were jumping off the 60-foot, granite cliffs on the west side of the cove. If those kids could see the leg-shattering rocks that we see under the surface, they might not be so eager to launch themselves over the edge. There were maybe 4 groups of divers out on the water. We’re an easy bunch to spot, because we tow a big red dive flag around with us so boats and other watercraft know to stay clear.
Visibility under the surface is never great at Fort Wetherill. It averages about 5 – 10 ft. Despite the limited visibility, we consistently spot wildlife like fiddler and horseshoe crabs, all manner of fish, small jellyfish, and skates (skates look like stingrays).
It is strongly recommended that you dive with a buddy. If something goes wrong when you’re 30 feet below the surface, like maybe you suddenly don’t have air to breathe, you want a buddy nearby who can quickly help you get sorted with your own equipment or provide their backup regulator. Scuba diving is a cautious endeavor with many redundancies and safety checks built in because, well, it’s dangerous as hell. Humans didn’t exactly come with stock anatomy that allows them to spend 30+ minutes under 20+ feet of water. The equipment that we wear is meant to keep us alive. If that equipment fails, that might be it; you could simply die. Because of the stakes here, scuba equipment, at least the stuff that dive shop professionals will recommend, tends to be of very high quality. It’s not cheap, but it will keep you alive, and it usually doesn’t fail you when you need it.
Enough of me being old grandpappy safety man. I mention all that safety stuff because a big part of diving is keeping track of your dive buddy. If you get separated for more than a few minutes, you need to return to the surface. If you meet each other at the surface, you can both feel a little dumb and go back down; but here’s another scenario: what if your dive buddy has lost consciousness and is at the bottom? The air in those tanks doesn’t last forever, people. At Fort Wetherill, you have 7 ft of visibility and can’t call out to that person underwater. I’m not trying to scare anyone out of scuba diving, but keeping track of your buddy is super-important.
Check out my non-sexy visibility demonstration video below:
The first portion of this video is a shot of me at maybe 10 – 15 feet of depth. You can tell this because the color of the plants and coral is all visible. The second half of the video is a shot of me at more like 20 – 25 feet of depth. The deeper you go, the more difficult it is to see.
Fort Wetherill is our “home” diving location. This is where I learned to dive, and we come here most frequently. Getting into the water is incredibly easy. You put your gear on in the parking lot, walk down the boat ramp, and boom, you’re swimming under the surface of the ocean. I value the location for many reasons, but I believe one of the most important ones is that our dives challenge us in terms of visibility, keeping track of our buddy, and navigation. We navigate with a compass underwater. There is no GPS available under the surface.
Carolyn and I dream of going on a trip to a tropical place where the water commonly has 100 ft of visibility, or “viz,” as divers call it. We plan on getting our next diving certification before that happens. This will allow us to go deeper. Our current Open Water Diver Certification through PADI allows us to go to 60 ft of depth. Advanced Open Water Diver Certification would extend that depth to 100 ft and involves much more advanced navigation skills.
Navigation skills are always useful, but you really need them when you’re in deep water. Depending on the clarity of the water you’re in, sunlight might not penetrate beyond 50 ft. Even 50 ft in a place like Wetherill is a dark and eerie experience. Some people go in for that type of thing. I’m more interested in the deeper depths in tropical water scenarios. You can also tack on extra certifications: gas mixtures that allow you to stay underwater longer, safety diver, and there’s even one for underwater photography. I think we will pursue our Advanced Open Water Diver Certification sometime during the summer of 2021.
Despite only having a max depth of 60 ft (trust me, 50 – 60 ft feels pretty damn deep), Carolyn and I have been in situations where we’ve had to navigate by our instruments alone. A dive we did last summer in Lake Wazee in Wisconsin comes to mind. This lake has viz averages of 30 – 40 ft and a max depth of 355 ft. I have a clear mental picture of us swimming at about 30 ft, Carolyn 7 ft ahead of me, and being flanked above and below, seemingly endlessly, by the exact same shade of blue. We could have flipped entirely upside-down and the only thing that would have indicated what was up would have been our bubbles making their way to the surface. In this scenario, it is critical that you’re watching your computer for the depth and your compass to make sure you’re swimming in a straight line in the direction you want to go. There’s nothing to orient yourself to. We ended up swimming straight out and doing a 90 degree turn to our left, landing us on some fish cribs chock full of big walleye. Successfully navigating with only our instruments on that particular dive was an extremely satisfying experience.
Many times, the most interesting stuff is within 5 – 10 ft of the surface. We got a great look at a flounder on our dive yesterday. Usually, we encounter these guys in deeper water, and they dart off before we get a good look. This was the first time I was able to observe just how graceful these creatures are when they swim.
If you’re a fan of Disney movies, you are probably familiar with this Flounder.
Yeah, the Disney version is definitely not a flounder! Flounders lay flat on the bottom like a ray. You can get a closeup view of a real flounder in the wild about halfway through the short video from our dive below.
I’ve been on a spirited virtual reality kick for the past 3 months. My VR foray began maybe a year and a half ago with the purchase of a PlayStation VR. I found some of the games to be interesting. Carolyn and I had fun with titles like Beat Saber and Superhot. The system exhibited limitations, especially when it came to the motion controllers, which were introduced during the PS3 era. The whole PSVR system is actually an amazing repurposing of discarded proprietary tech from Sony, but shortcomings revealed themselves in terms of tracking. I wasn’t a fan of movement in the worlds either. The motion controllers are joystick-less wands with buttons on them.
I was determined to get a bit more out of my PSVR before trading it in toward a PS5 later this year, so I looked up a list of premier PSVR games and gave one of them a go. I landed on Blood and Truth: a first-person action espionage game that offered visuals and mechanics well beyond what I had initially experienced on the PSVR. I was impressed. And then the motion controllers wouldn’t track. And I’d end up being pissed. Despite these shortcomings, Blood and Truth opened my eyes to the fact that VR development had been quietly churning along, at least outside my purview, and improving vastly over the past few years.
As much as I happily drop money on awesome tech (I’m not rich, people, but I’ll easily make calculations in my mind on how skipping out on meals can pay toward shiny toys, but I digress), I wasn’t going to spend $1200 on a Valve Index. When I was shopping around for VR systems, the Index was the best in terms of fidelity, but it required base stations for tracking, and I wasn’t convinced of how invested I really was in VR.
I settled on the Oculus Rift-S, which was a VR system that could plug into my gaming laptop, had decent fidelity, used inside-out tracking and didn’t require base stations. It also cost $400. I could tolerate the price tag, and it seemed like one of the more convenient options for an “all in” VR setup. It was difficult to find one to purchase online. They were sold out everywhere: a testament to the growing popularity of VR. I managed to get one with a little added premium, but 3 months later, I’m happy with my purchase.
In an unexpected turn of events, I’ve convinced my mother to buy an Oculus Quest, the PC-free version of VR. We frequently chat and play VR frisbee golf together. And to be honest, it didn’t take much convincing. She was hooked the moment I let her try the Rift-S. My dad’s first foray into VR boxing was unbelievably hilarious. I had to restrain him before he crashed through the TV, swinging at his virtual opponent. Observing the speed at which my parents adapted to using VR was astonishing to me. I’ve been nerding out on games since I was a child, and we’ve never played games together. It made me realize that this tech will affect both the young and old, and it will encompass much more than niche gaming applications.
Along with catching up with Mom, I’ve been using my Oculus as part of my workout routine. I’ll pop in and do 15 – 20 minutes of shadow boxing in BoxVR. I purchased a 16lb weighted-vest that I wear and 1lb wrist wraps. These weight additions enhance my workout greatly. I’ve come to enjoy melee-based VR games like Until You Fall and Asgard’s Wrath. I’m definitely feeling a major physical workout after putting an hour into one of those games. The weights do an amazing job at heightening the sense of physicality in worlds where you are swinging swords and blocking blows with shields.
Asgard’s Wrath is a no-kidding RPG. There’s a full inventory system and side-quests. You play as the God of Animals and can possess a warrior called the Shield Maiden. In your colossal godly form, you can move huge objects out of the way for the other figurine-sized characters. You also have the ability to pick up animals and transform them into companions that will fight alongside you. I’ve played a few hours and it’s really a step above anything else I’ve played in terms of real video game mechanics and systems. Most VR games that I’ve played have had a relatively simple game loop as far as modern games go. I’m excited to see where the future of this technology leads.
In the spirit of VR, I recently read Ready Player One by Ernest Cline. I enjoyed the book a great deal. I’m not huge on the ’80s nostalgia factor, but I got most of the video game references and the main movie and book references. I think my recent experimentation with VR tech provided analogs that helped me imagine the technology and environments being described in the book. I eagerly await the sequel to Ready Player One, which is, you guessed it, Ready Player Two, coming out in November of 2020.
I’m experimenting with VR development. I’ve dabbled with game development in the past in both the Unreal Engine and Unity, but there is something that blows me away when I create an environment in VR and can traverse around in it! More on this as I experiment more. I’ve been in touch with a VR researcher at URI and am signed up for a virtual Immerse-a-thon event coming up in late September. During this event, I will have the opportunity to work with a team to develop an educational VR or AR (Augmented Reality) app. I’m excited!
I was frustrated this morning after a Zoom meeting. We’ve all needed to make big changes on how we’ve done things since this COVID-19 thing hit. Some of us have handled change better than others. In my world, it’s been adapt or die for a long time. I know I’m being a prickly hard-ass about it, but it’s meant that I’ve been able to excel even in the hardest of times.
As a teacher, adapting during COVID-19 has meant becoming more proficient with online tools and platforms. Granted, I’m a tech guy; I use this stuff every day. Heck, I’ve even somehow convinced my mother to buy an Oculus Quest VR system so we can bowl or play frisbee golf and catch up virtually a few times a week. I get it. It’s much easier for me to jump into some of these technologies than it is for others. Here’s the thing: never in my world has it been okay for me to say, “I can’t do it.”
You find a way. If suddenly Americans lose the ability to speak English and we all need to speak Japanese, guess what, I’m learning how to speak Japanese. If I need to learn how to do advanced mathematics, and trust me, this is the most torturous activity I can imagine, I’d learn how to do the math. It would suck in the beginning. I would work and sweat and beat my head in. And then I’d figure it out.
I work in the non-profit world. Our resources are limited. People are performing functions that should probably be performed by twice as many staff members, but the money isn’t there, and the public needs our help. When someone can’t step up to a challenge and isn’t willing to find a way to push forward, they put an extraordinary amount of stress on other team members. Either do your job or step aside.
That’s where my mind was this morning. I left the meeting and jumped straight into an online ESL class (which I planned at 11 o’clock last night by the way).
I promise I’m done ranting.
Class went well, and I realized I had a few messages in the Call Me Positive hopper that needed to be posted. The first message was about how someone brought flowers to their favorite teacher and how they had this tearful exchange. The next message was about someone spending the day working in their yard, enjoying the sun, and trying to keep papers from blowing away. The final message was from another ESL teacher that I know talking about a parade in Central Falls, RI and how the community has banded together in these tough times.
I felt so much better after listening to those messages and posting them up in the podcast. Call Me Positive hasn’t blown up in terms of popularity. It’s definitely made an impact though—on those that call in and on those that listen, including me. Sometimes, I can get on this downward spiral of anger. I focus too closely on one thing, and I can’t break out of it. I know I’m wasting my emotional and mental resources by focusing on what made me angry, but I can’t get out of it. Listening to those positive messages helped me break the chain. Writing this has helped as well.
I’m proud of Call Me Positive. I hope it can help others find little moments of happiness for years to come. I’ve posted the episode that helped me feel a little better below.
I made a strawberry-cherry pie this week; nothing extreme, just canned strawberries and cherries with a Pillsbury crust. The pie isn’t as good as grandma’s but not half-bad considering its humble roots! Anyway, before I ate my pie this evening, I sprung for the deluxe treatment and capped the pastry with slices of Colby cheese and nuked it.
The result? Deliciousness. Cheese stacked on pie might be a Wisconsin thing—one of the many home-field strategies to help Wisconsinites consume more cheese. I wouldn’t put it past state lobbyists and lawmakers. After all, the state legislature banned yellow-colored margarine in the state in 1895 to protect dairy farmers’ interests.
“It all began in 1895 when the State Legislature passed a law prohibiting the manufacture or sale of yellow-colored margarine, also called “oleo,” because it was believed to be a threat to the dairy industry. It remained illegal for almost 75 years, until margarine was finally decriminalized in 1967.”
The quoted article is a fun, short read. I’m sure I could get to the bottom of my cheesy pie mystery, but some things, such as the Bermuda Triangle or what Nessie drinks on Sundays, are best left in a mystical state. Seriously though, does anybody else eat their pie with a little cheese on top? Carolyn thinks I’m crazy! Shoot me an email. I need some backup!
Other News From the Front
The PCVI Summer Writing Seminar begins tomorrow. I’m feeling confident. This evening, I reviewed our reading for this week and moved Joan Didion’s “On Keeping a Notebook” to the optional section. I don’t want to overwhelm students. Anne Lamott’s “Shitty First Drafts” moved to the top of the list for mandatory reading. Her short essay is funny and will hopefully give students confidence and energy to get words on the page. They’ll need it.
I’m kicking them out of the nest day one and demanding they bring back a story draft by our next meeting based on something they witness out in the wild. This exercise, called “I Am a Camera,” is pulled from our text, The Making of a Story by Alice LaPlante. I want students to get points on the board early in terms of finishing a draft. I’m sure the experience will be terrifying for some students, but once they reckon with the fact they’re surrounded by narrative arcs and intriguing characters, well, it might be all they need from me. The rest will be heaps of cheese.
Delicious, melted cheese.
The PCVI Summer Writing Seminar is an awesome learning experience for me, in terms of both instructional design and my writing. It feels good to perform little tweaks on this hotrod of a class—take a steel brush to the spark plugs here, grease a spring there. I hope it doesn’t explode upon takeoff!
I’m currently reading Exhalation by Ted Chiang. I’ll write a more detailed impression of the book after I’ve finished it, but I need to share this little anecdote. The first story in the book, “The Merchant and the Alchemist’s Gate” deals with the mind-bending concept of an alchemist’s gate which allows a person to travel 20 years into the future by passing through one side and travel 20 years into the past by passing through the other side. There are no Back to the Future rules here. You can say hello to your past or future self. You can interact with them in any way you’d like. No problem. As long as the gate is available, you can pass back through to your time.
You cannot alter the past or change the future. You can only observe it in greater detail. When you look at your past or future in greater detail, you might witness the fact that you’ve already traveled back and altered the past and the present you lived was a precursor to you traveling back to the past…and…Ted Chiang somehow makes this madness work.
I read “The Merchant and the Alchemist’s Gate” late at night and napped a bit on the couch, and holy crap, my brain was in overdrive trying to figure out the weird time travel logic seeds Ted Chiang had sown. Talk about strange dreams! What an absolute master!
I finished the short story collection Tomorrow Factory by Rich Larson today. It was an excellent read with stories of varying lengths and subject matter. The story “An Evening with Severyn Grimes” brought me to this collection. I read it in Forever Magazine last week. Mr. Larson was generous enough to include a short background segment for each story. One of the inspirations for “Severyn Grimes” was the show Altered Carbon on Netflix. He nailed it. Altered Carbon came to mind immediately. Like the show, his story is violent, sexy, and cool cranked up to 11.
Here are a few stories in the collection I especially enjoyed:
“Meshed” – Ultra-talented basketball kid is being recruited by Nike and must decide whether or not to have a surgical sensory recording interface installed in his body. The interface allows others to see and feel as he does. The thing is, his family has a history with meshes, and his father will not let his son make such an impactful decision lightly.
“Ghost Girl” – Young albino girl is living in a scrapyard in Burundi. An agent is sent to extract her to safety. Albino parts go for a high price on the black market, and armed mercs are inbound. Good thing the girl is protected by an AI-controlled, bear-sized mech. Or is it AI-controlled?
“Capricorn” – Space prison, cryo-lockup, shivs, drug chemist, escape attempt, and murderous med-bot. Inspired by the game Chronicles of Riddick: Escape from Butcher Bay. Yeah, freaking awesome!
Tomorrow Factory was an engaging read start to finish. I will definitely be on the lookout for Rich Larson in the future!
The PCVI Summer Writing Seminar is set to begin this upcoming Wednesday. I’m excited. I have a new “problem.” For the first time in running a workshop or event, demand far exceeds capacity! The impending budget crisis in Rhode Island and the possibility of reduced funding have caused me to modify components of the course, particularly the What Cheer Writer’s Club co-working membership, which is now one month for students instead of two. Reducing co-working time gives us a little wiggle room in the budget and has allowed us to add a few additional students. We’re up to fifteen students with one more in the wings. The original plan promised to serve ten to twelve.
This is great news, but I need to be careful not to over-enroll and lose the ability to provide a meaningful experience. If I hadn’t meticulously planned this thing, I would be more concerned. My largest consideration is whether I can provide feedback on student work in a timely manner.
There is a great need out there for this type of programming. The enrollment numbers for this class prove it. We had $260 earmarked for paid online advertising to promote the class. We didn’t use a cent of that money. It was all organic networking. It’s difficult for me to turn someone away from this class. I know how powerful and healing writing about personal experiences can be.
I’ll be conducting final checks on week one materials tomorrow and reviewing readings for the week.