Image caption: (l-r) Thabiti Willis, Jack Gieseking, Adriana Estill in conversation. Photo by Briannon Carlsen.
Author: Nathan Mannes, ’19
With supplies from the Geology Department and advising from Andrew Wilson, we have created an Arduino-based water-depth monitor. The grey cone you see at the bottom of the photo is a sonar device that measures the distance to the closest solid object in front of it. That could be a wall, but we intend to mount it over a body of water, like the Lyman Lakes, to measure the water's depth over a long period of time with little maintenance. Because the monitor is solar powered, we can leave it outside and let it send readings on its own.
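The math behind the sonar reading is simple time-of-flight arithmetic. Here is a minimal sketch of that calculation; the speed-of-sound constant and the mounting height are illustrative assumptions, not our actual hardware configuration:

```python
# Illustrative time-of-flight math for an ultrasonic/sonar depth monitor.
# The constants below are assumed values for the sketch, not measured ones.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_to_distance(echo_time_s):
    """One-way distance (m) from a round-trip echo time (s)."""
    return echo_time_s * SPEED_OF_SOUND / 2.0

def water_depth(echo_time_s, mount_height_m):
    """Depth of the water, given the sensor's height above the lake bed.

    The sensor measures the distance down to the water surface, so the
    depth is the mounting height minus that distance.
    """
    return mount_height_m - echo_to_distance(echo_time_s)

# Example: a 10 ms round trip from a sensor mounted 2 m above the lake bed
# puts the surface 1.715 m below the sensor, i.e. about 0.285 m of water.
```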
On the right side you see a 3G shield module (with the antenna) mounted on an Arduino. It uses mobile data to send readings over the internet. But it has to send that data somewhere, right? We are setting up a public-facing web server so that we can keep track of this data long-term. Then, much like the water tower, we will always be able to check what the depth of the Lyman Lakes is. In the future, we intend to expand the monitor to take other readings on the water, such as pH, temperature, or flow volume.
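Each reading then needs to be packaged before the shield uploads it. A minimal sketch of what one transmitted reading could look like follows; the field names and station ID are hypothetical placeholders, not our actual schema:

```python
import json
import time

def make_payload(depth_m, station_id="lyman-lakes-1"):
    """Serialize one depth reading as JSON for upload to the web server.

    The field names and the default station ID are illustrative only.
    """
    return json.dumps({
        "station": station_id,
        "depth_m": round(depth_m, 3),          # meters, 3 decimal places
        "timestamp": int(time.time()),         # seconds since the Unix epoch
    })
```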
Andrew, Dann, and Janet presented at the Online Learning Consortium Innovate! Conference in Nashville. Their talks were (respectively):
Dann’s notes from sessions he attended are summarized below:
Hey Folks, spring is on us. Here is a little of what I’ve been up to, and what I’m looking forward to.
Leaning on my MFA in Digital Cinema and 15 years of teaching experience, I’ve designed a stand-alone two-credit course focused on Civic Engagement and Documentary Filmmaking that I’ll be co-teaching with the impressive Palmar Alvarez-Blanco here at Carleton. The curriculum can be coupled with nearly any course, pairing students with community organizations that need greater support and visibility. Students will spend the term researching, meeting with, and interviewing members of these community organizations, and then . . . giving a tangible video resource back to that community organization. We’ll cover topics such as bias recognition, visual storytelling strategies, interview techniques, non-linear editing, and social media marketing. This is going to be a fun and engaging class that results in rich civic engagement, valuable documentary filmmaking experience, and a concrete and useful video for several community organizations.
I’m also going to hit the road this spring, presenting at conferences including the Online Learning Consortium (OLC) conference in Nashville, Tennessee, and the Innovate! Teaching with Technology conference at the University of Minnesota Morris. I’ll be presenting sessions on Planning, Producing, and Evaluating Instructional Video and on Creating Effective Instructional Videos, and I’ll be co-leading a discussion on Online Teaching and Learning for Small Liberal Arts Schools with my colleagues Janet Russell and Andrew Wilson.
Spring is also exciting because one of my personal projects, a compact teleprompter I call the Little Prompter, is ready to hit the market. Over the past year, I worked with a creative and crafty colleague on the design (thanks, Eric Mistry up at St. Scholastica!); I then ran a successful fundraising campaign to get it manufactured, and I am now ready to market and sell it. The Little Prompter is more than just a pet project, too. It has great pedagogical value. Even for experienced instructors, delivering a lesson on camera can be a little intimidating, and even minor discomfort and hesitation on camera can greatly affect how a viewer perceives the speaker and how long a viewer stays engaged with the content. Now, with the Little Prompter and a little pre-planning, faculty can flawlessly deliver their lesson directly into the camera, improving eye contact and viewer retention. Faculty here at Carleton (and around the world) can learn more about the Little Prompter, and even order one for themselves, at www.littleprompter.com.
Fall and winter terms were an exciting time for me, with the arrival of our new 3D printer and the in-class trial of one of my Augmented Reality (AR) applications. Spring term will be just as exciting but a bit more virtual for me, as I will be spending time developing virtual experiences for Psychology and making virtual proteins a reality.
Spring term will also see more development and another full trial of our Biochemistry AR application. Working together with Rou-Jia Sung, we will be developing additional modules for use within the Intro to Biochemistry course this term. On this front, we will also be applying for an NSF grant to fund further research into the use of AR within a classroom setting. Excitingly, the AR application will be presented twice this term: at the Online Learning Consortium (OLC) conference in Nashville and at the Society for the Advancement of Biology Education Research (SABER) meeting.
Spring will also be an exciting time for me personally. Now that I am settled at Carleton, and having worked with the wonderful librarians, I am about to embark on writing my third book, Visualizations in Cultural Heritage. The book will examine the history and development of the multitude of visualizations employed within the cultural heritage field.
This term I’m looking forward to more sunshine and outdoor running! But I’m also looking forward to the data collection phase for a few of my research projects. I’ve got quite a busy term ahead of me!
I’m working with Asuka Sango (Religion) on implementing some gamification techniques in her Zen Buddhism course. The goal of this project is to give students positive reinforcement for good study behaviors and for engaging with optional components of her course. While research suggests that gamification works well, it will be interesting to see what we can learn about its efficacy in a small humanities course.
I’m also stepping up my work on Language Lesson, software I designed as a practice tool for foreign-language speaking exercises. This year I’m delving deep into the fields of acoustic phonetics and digital signal processing to try to introduce intelligent features based on second-language-acquisition research. I will be presenting on the development of Language Lesson and the implementation of its pitch graph display at the next CALICO conference in late May.
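To give a flavor of the signal processing involved (this is not Language Lesson's actual implementation, just a textbook illustration), a basic autocorrelation pitch estimator of the kind used for pitch graphs looks something like this:

```python
import numpy as np

def estimate_f0(signal, sample_rate, fmin=75.0, fmax=400.0):
    """Estimate the fundamental frequency (Hz) of a voiced audio frame.

    Textbook autocorrelation method: the autocorrelation of a periodic
    signal peaks at a lag equal to its period. fmin/fmax bound the search
    to a plausible range for speech.
    """
    signal = signal - np.mean(signal)              # remove DC offset
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]                   # keep non-negative lags
    lag_min = int(sample_rate / fmax)              # shortest plausible period
    lag_max = int(sample_rate / fmin)              # longest plausible period
    peak_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / peak_lag
```

Real pitch trackers add voicing detection, windowing, and smoothing on top of this, but the core idea is the same.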
On this project, I’m collaborating with Andrew Wilson, who is helping to manage a team of student developers to realize this project. I’m excited that these students are getting some practice with software development and experience with tools used in industry.
Amongst all of this, I’m also traveling to Japan in April to participate in the International Kyudo Federation’s International Kyudo Seminar and Shinsa (rank examination). I’ll be learning, taking a rank examination, and volunteering as an interpreter. It’s going to be an exhausting trip, but I appreciate the opportunity to visit Japan and make use of my language skills to help others learn.
Join us for this term’s series of [un]Workshops:
10/17 (Tue) – 3-4p – Leighton 426
Title: Social Reading and Notetaking: An Overview of Web Annotation Tools
Blurb: A foundational activity for many courses is the critical reading of texts. This [un]workshop will demo and discuss how annotation of online texts (newspaper articles, scholarly pieces, pop culture artifacts) can help students better understand how scholars read, analyze, and synthesize different kinds of written materials. Please bring your laptop if you would like to follow along with the demonstration of Hypothes.is and PRISM.
10/26 (Thu) – 3-4p – LDC 104
Title: Oral feedback in Language Lesson
Blurb: Providing contextualized feedback on students’ foreign language production is well known to be a big contributor to student success. This [un]workshop will demonstrate Language Lesson, a tool designed to let students record speaking exercises and to allow instructors to respond by placing oral feedback directly into the recordings. Language Lesson is being piloted with several language classes this term, but it has the potential to be used with audio recording exercises in any discipline.
11/02 (Thu) – 12-1p – LDC 104
Title: GIS, Spatial Analysis and You: Mapping your research data!
Blurb: Geographic Information Systems (GIS) have become important tools within research, but can also give your datasets critical spatial contexts. This [un]workshop will give a basic introduction to GIS and spatial analysis, discussing what is possible with spatial databases for both traditional and non-traditional contexts.
Time for my second post. It’s a lot later than expected; I still haven’t got this blogging thing down.
As part of the fun new tech we have been purchasing at Carleton, we managed to get hold of a Hololens. Unlike the HTC Vive, which is VR, the Hololens is AR (Augmented Reality). The Hololens is an impressive piece of kit and the one I am most excited about. According to Microsoft (its developer), the Hololens is “the first self-contained, holographic computer, enabling you to engage with your digital content and interact with holograms in the world around you.” In normal terms, it is a tiny computer attached to a set of glass lenses, and it looks like a very futuristic headset.
These lenses are where the magic happens. The Hololens has three layered screens, one each for the red, green, and blue channels, which combine to render full-color objects. The onboard computer uses an inertial measurement unit to calculate your location and that of the “holographic” object within your surroundings. This technology works in a similar way to AR on your cell phone in games like Pokemon Go and Ingress.
The Hololens opens up some fascinating teaching possibilities. Unlike the Vive and VR, which are isolating, single-user experiences, the Hololens and AR can be developed into a multi-user experience. This enables each Hololens to view the same 3D model, providing some exciting possibilities in the classroom.
One of the first projects we worked on was an AR model of the Piper J3 Cub used to train Carleton students in the 1940s and ’50s, as part of a museum display for the Sesquicentennial celebrations. The original idea was to use VR and the HTC Vive, but I felt the Hololens would be more fun for visitors and would still allow them to be present within the space. Thank you to PEPS for editing one of my favorite videos made with the Hololens.
Video from Piper Cub J3 (https://vimeo.com/189338455). Watch this space for more fun videos!
This month, Sarah Calhoun and I attended dh2017 in Montreal to present a prototype augmented reality app co-developed with Andrew Wilson and Adam Kral. Our poster and additional resources are linked here, but here’s the synopsis:
Our goal was to create an augmented reality app that could better visualize complex and multiple temporalities AND be an easy, reusable resource for classroom use. We chose a mural painted in a Thai Buddhist temple in the UK as our case study because of its layered iconography: the mural depicts the Buddha’s defeat of Mara, but the painter chose to include anachronistic elements, including machine guns, Vincent Van Gogh, and a rocket ship. We wanted a way to highlight both the historical references, which could be plotted along a traditional chronological timeline, and the temporality of the Buddha’s history, which could not.
We got useful and positive feedback during the poster session at dh2017, as well as additional ideas for refining and extending the app from attending several sessions. Our next steps are to clean up some of the identified bugs and to do several rounds of user testing with faculty, staff, and students to clarify how to proceed.
Kral, a rising sophomore, did the bulk of the development work over the summer, learning Unity and building the app out in AR Toolkit. His account of what he built is posted here; we plan to continue building on Adam’s work, and we thank him for his efforts!
I’m thrilled to say that Andrew Wilson, Sarah Calhoun, and I had our poster proposal accepted for dh2017 in Montreal! We’re experimenting with augmented reality for representing complex temporalities in Buddhist temple murals, and with creating lower-barrier-to-entry teaching modules using AR.
Our poster will outline our theoretical framework, detail our development process using Vuforia, and provide possible avenues for further lines of inquiry and applications for temporal visualizations. We’ll include static images of the AR experience, as well as ways to access our project remotely.
We identify two main problems that this initial experiment will address. The first is the issue of visualizing multiple temporalities. Our motivating questions are: what are the visual and spatial relationships between the chronological story of the Buddha defeating Mara and the belief among some Buddhists that the Buddha is personal, eternal, and always present throughout time? How is that tension expressed in the mural through a wide range of artistic styles and historical references? We will address these questions over the course of our research.
The second problem is a more practical question: how can augmented reality further research and teaching of these complex cultural concepts when both the visual and technical resources are limited? We intend to use the extant low-resolution photographs of the Defeat of Mara temple mural and the augmented reality framework Vuforia to create a cross-platform experience of this religious expression. This will allow users to see and select individual elements in the mural (such as the Mona Lisa or the spaceship) and engage with the different ways one can order and make meaning out of the varied chronologies and temporal references. Vuforia gives us an existing framework with the benefit of being accessible on multiple platforms, which we believe is necessary for facilitating the adoption of augmented reality for classroom and preliminary research uses.
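To make the "multiple orderings" idea concrete, here is a hypothetical sketch of how each selectable mural element could carry more than one temporal label; the field names, labels, and element roles are illustrative inventions, not our app's actual data model:

```python
# Hypothetical data model: each selectable mural element carries both a
# historical reference and a role in the Buddha narrative, so the same
# set of elements can be ordered along more than one temporal axis.

mural_elements = [
    {"name": "machine guns",     "historical_era": "20th century", "narrative_role": "Mara's army"},
    {"name": "Vincent Van Gogh", "historical_era": "19th century", "narrative_role": "onlooker"},
    {"name": "rocket ship",      "historical_era": "20th century", "narrative_role": "Mara's army"},
]

def order_by(elements, axis):
    """Names of the elements sorted along one chosen temporal axis."""
    return [e["name"] for e in sorted(elements, key=lambda e: e[axis])]
```

Sorting the same list by `"historical_era"` versus `"narrative_role"` yields different sequences, which is exactly the kind of re-ordering the app lets users explore.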