BiochemAR, an augmented reality app for visualizing 3D molecular models, is now available for download on Apple’s App Store and Google Play Store. This app, a collaboration between Rou-Jia Sung (Biology) and Andrew Wilson (AT), also includes learning modules and suggestions for using the app in the classroom. To read more, check out this write-up in The Scientist. If you’re interested in more information, or in talking through developing additional modules, please email Rou-Jia (email@example.com) or Andrew (firstname.lastname@example.org) directly.
This blog post has been a long time coming; I have been meaning to write about our ongoing HoloLens developments for some time. I want to start by saying that, even after more than a year with the HoloLens, it still excites me more than any of the other VR/AR technology currently available. Since I last posted, we have purchased three more HoloLens units. This expansion was to enable multi-user experiences, something I think makes the HoloLens and AR stand out from VR in a classroom environment. These extra units have let me work on two fascinating projects, Spectator View and Shared Reality view, both of which use multiple HoloLens units.
We have had the HoloLens for over a year now and have only one video demonstrating it, because recording the AR through the HoloLens itself is so difficult. Microsoft thought of this and created Spectator View, which lets you plug a digital camera and a second HoloLens into a computer and stitch together the images from both, so you can record the HoloLens experience at a much higher resolution. To do this, you need that second HoloLens and a mount to hold it onto the digital camera. Second HoloLens, check; HoloLens mount, check (see the picture, I 3D printed one over the summer). Then came the hard part: although Microsoft has created the software for Spectator View, it isn’t packaged up as a nice, easy application. You have to build it yourself from the source code. After a few hours of debugging, I finally got all of the required applications working. This is our current setup.
I am looking forward to making some new HoloLens videos.
Shared Reality view
The second package I have been working on is a shared reality experience in which users explore an archaeology site, Bryn Celli Ddu, and its associated data. Like Spectator View, Shared Reality uses multiple units: it allows every HoloLens user to see the same hologram within the same space. This lets us create shared experiences, which is a vital tool for teaching: everyone can see and interact with the same object in the same space. It adds a whole new level to AR, allowing for more social interaction rather than isolating users in their own ‘realities’ the way VR and single-user experiences do.
This Shared Reality experience was demoed at GIS Day.
Image caption: (l-r) Thabiti Willis, Jack Gieseking, Adriana Estill in conversation. Photo by Briannon Carlsen.
Author: Nathan Mannes, ’19
With supplies from the Geology Department and advising from Andrew Wilson, we have created an Arduino-based water-depth monitor. The grey cone at the bottom of the photo is a sonar device that measures the distance to the closest solid object in front of it. That could be a wall, but we intend to mount it over a body of water, like Lyman Lakes, to measure the water’s depth over a long period with little maintenance. Because the monitor is solar powered, we can leave it outside and let it send readings on its own.
On the right side you can see a 3G shield (with the antenna) mounted on an Arduino; it uses mobile data to send readings over the internet. But it has to send that data somewhere, so we are setting up a public-facing webserver to keep track of the readings long-term. Then, much like with the water tower, we will always be able to check the depth of Lyman Lakes. In the future, we intend to expand the monitor to take other readings from the water, such as pH, temperature, or flow volume.
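We haven’t finalized the server’s API yet, but the idea is simple: each reading becomes a small HTTP request that the 3G shield pushes to the server. As a rough sketch (the host, path, and field names below are placeholders, not our actual endpoint), the request the Arduino assembles would look something like this:

```cpp
#include <sstream>
#include <string>

// Build the raw HTTP POST for one depth reading. The endpoint and
// field names are hypothetical placeholders for the eventual server API.
std::string make_reading_request(const std::string& host,
                                 const std::string& path,
                                 int sensor_id, double depth_cm) {
    std::ostringstream body;
    body << "sensor=" << sensor_id << "&depth_cm=" << depth_cm;

    std::ostringstream req;
    req << "POST " << path << " HTTP/1.1\r\n"
        << "Host: " << host << "\r\n"
        << "Content-Type: application/x-www-form-urlencoded\r\n"
        << "Content-Length: " << body.str().size() << "\r\n"
        << "\r\n"
        << body.str();
    return req.str();
}
```

On the shield, this string would be written out over the modem’s serial connection; on the server side, any small web framework can accept the form data and append it to a database for long-term tracking.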
Andrew, Dann, and Janet presented at the Online Learning Consortium Innovate! Conference in Nashville. Their talks were (respectively):
- Adding a New Dimension to Protein Structures: student perceptions of augmented reality in a biochemistry course,
- Planning, Producing, and Evaluating Instructional Video, and
- Is online teaching and learning relevant for small residential liberal arts colleges?
Dann’s notes from sessions he attended are summarized below:
Hey folks, spring is upon us. Here is a little of what I’ve been up to, and what I’m looking forward to.
Leaning on my MFA in Digital Cinema and 15 years of teaching experience, I’ve designed a stand-alone, two-credit course focused on Civic Engagement and Documentary Filmmaking that I’ll be co-teaching with the impressive Palmar Alvarez-Blanco here at Carleton. The curriculum can be coupled with nearly any course, pairing students with community organizations that need greater support and visibility. Students will spend the term researching, meeting with, and interviewing members of these organizations, and then . . . giving a tangible video resource back to them. We’ll cover topics such as bias recognition, visual storytelling strategies, interview techniques, non-linear editing, and social media marketing. This is going to be a fun and engaging class that results in rich civic engagement, valuable documentary filmmaking experience, and a concrete, useful video for several community organizations.
I’m also going to hit the road this spring, presenting at conferences including OLC (the Online Learning Consortium) in Nashville, Tennessee, and the Innovate! Teaching with Technology conference at the University of Minnesota Morris. I’ll be presenting sessions on Planning, Producing, and Evaluating Instructional Video and on Creating Effective Instructional Videos, and I’ll be co-leading a discussion on online teaching and learning for small liberal arts schools with my colleagues Janet Russell and Andrew Wilson.
Spring is also exciting because one of my personal projects, a compact teleprompter I call the Little Prompter, is ready to hit the market. Over the past year I worked with a creative and crafty colleague on the design (thanks, Eric Mistry up at St. Scholastica!); I then ran a successful fundraising campaign to get it manufactured, and am now ready to market and sell it. The Little Prompter is more than just a pet project, too. It has real pedagogical value. Even for experienced instructors, delivering a lesson on camera can be a little intimidating, and even minor discomfort and hesitation on camera can greatly affect how a viewer perceives the speaker and how long the viewer stays engaged with the content. Now, with the Little Prompter and a little pre-planning, faculty can flawlessly deliver their lesson directly into the camera, improving eye contact and viewer retention. Faculty here at Carleton (and around the world) can learn more about the Little Prompter, and even order one, at www.littleprompter.com.
Fall and winter terms were an exciting time for me, with the arrival of our new 3D printer and the in-class trial of one of my Augmented Reality (AR) applications. Spring term will be just as exciting but a bit more virtual for me, as I will be spending time developing virtual experiences for Psychology and making virtual proteins a reality.
Spring term will also see more development and another full trial of our Biochemistry AR application. Working together with Rou-Jia Sung, we will be developing additional modules for use within the Intro to Biochemistry course this term. On this front, we will also be applying for an NSF grant to fund further research into the use of AR within a classroom setting. Excitingly, the AR application will be presented twice this term: at the Online Learning Consortium (OLC) conference in Nashville and at the Society for the Advancement of Biology Education Research (SABER) meeting.
Spring will also be an exciting time for me personally. Now that I am settled at Carleton, and having worked with the wonderful librarians, I am about to embark on writing my third book, Visualizations in Cultural Heritage. The book will look at the history and development of the multitude of visualizations employed within the cultural heritage field.
This term I’m looking forward to more sunshine and outdoor running! But I’m also looking forward to the data collection phase for a few of my research projects. I’ve got quite a busy term ahead of me!
I’m working with Asuka Sango (Religion) on implementing some gamification techniques in her Zen Buddhism course. The goal of this project is to provide students with positive reinforcement for good study behaviors and for engaging with the optional components of her course. While research suggests that gamification works well, it will be interesting to see what we can learn about its efficacy in a small humanities course.
I’m also stepping up my work on Language Lesson, software I designed as a practice tool for foreign-language speaking exercises. This year I’m delving deep into the fields of acoustic phonetics and digital signal processing to try to introduce intelligent features based on second language acquisition research. I will be presenting on the development of Language Lesson and the implementation of its pitch-graph display at the next CALICO conference in late May.
On this project I’m collaborating with Andrew Wilson, who is helping to manage the team of student developers realizing it. I’m excited that these students are getting practice with software development and experience with tools used in industry.
Amongst all of this, I’m also traveling to Japan in April to participate in the International Kyudo Federation’s International Kyudo Seminar and Shinsa (rank examination). I’ll be learning, taking a rank examination and volunteering as an interpreter. It’s going to be an exhausting trip, but I appreciate the opportunity to visit Japan and make use of my language skills to help others learn.
Join us for this term’s series of [un]Workshops:
10/17 (Tue) – 3-4p – Leighton 426
Title: Social Reading and Notetaking: An Overview of Web Annotation Tools
Blurb: A foundational activity for many courses is the critical reading of texts. This [un]workshop will demo and discuss how annotation of online texts (newspaper articles, scholarly pieces, pop culture artifacts) can help students better understand how scholars read, analyze, and synthesize different kinds of written materials. Please bring your laptop if you would like to follow along with the demonstration of Hypothes.is and PRISM.
10/26 (Thu) – 3-4p – LDC 104
Title: Oral feedback in Language Lesson
Blurb: Providing contextualized feedback to students on their foreign language production is well-known to be a big contributor to student success. This [un]workshop will demonstrate Language Lesson, a tool designed to facilitate student recording of speaking exercises, and to allow instructors to respond by placing oral feedback directly into the recordings. Language Lesson is currently being piloted with several language classes this term, but has the potential to be used with audio recording exercises in any discipline.
11/02 (Thu) – 12-1p – LDC 104
Title: GIS, Spatial Analysis and You: Mapping your research data!
Blurb: Geographic Information Systems (GIS) have become important research tools that can also give your datasets critical spatial context. This [un]workshop will give a basic introduction to GIS and spatial analysis, discussing what is possible with spatial databases in both traditional and non-traditional contexts.
Time for my second post. It is a lot later than expected; I still haven’t got this blogging thing down.
As part of the fun new tech we have been purchasing at Carleton, we managed to get hold of a HoloLens. Unlike the HTC Vive, which is VR, the HoloLens is AR (augmented reality). The HoloLens is an impressive piece of kit, and the one I am most excited about. According to Microsoft (its developer), the HoloLens is “the first self-contained, holographic computer, enabling you to engage with your digital content and interact with holograms in the world around you.” In plain terms, it is a tiny computer attached to a set of glass lenses that looks like a very futuristic headset.
These lenses are where the magic happens. The HoloLens has three layered screens, one each for the red, green, and blue channels, which combine to render full-color objects. The onboard computer uses an inertial measurement unit to calculate your location and that of the “holographic” object within your surroundings. This technology works in a similar way to AR on your cell phone in games like Pokemon Go and Ingress.
The HoloLens opens up some fascinating teaching possibilities. Unlike the Vive and VR, which are isolating, single-user experiences, the HoloLens and AR can be developed for multiple users. A multi-user experience enables each HoloLens to view the same 3D model, providing some exciting possibilities within the class.
One of the first projects we worked on was an AR model of the Piper J3 Cub used to train Carleton students in the 1940s and 50s, as part of a museum display for the Sesquicentennial celebrations. The original idea was to use VR and the HTC Vive, but I felt the HoloLens would be more fun for visitors and would still allow them to be present within the space. Thank you to PEPS for editing one of my favorite videos made using the HoloLens.
Video of the Piper J3 Cub (https://vimeo.com/189338455). Watch this space for more fun videos!