Marchillo, GIS honors major, works on Augmented Reality Sand Table

Story and photos by Kathy Eastwood, Staff Writer

December 14, 2017

A demonstration of a Virtual Sand Table took place Dec. 6. The Sand Table is a project that Class of 2018 Cadet Jake Marchillo has been working on for three months in support of the Army Research Lab in Orlando, Fla. The low-silica sand sits beneath a sensor that detects how the sand is shaped, so the image cast by a projector can be updated to match the sand's changing dimensions. The map used for the demonstration was a time lapse (1910 to present) of West Point's changing topography.
Capt. Gabe Powell demonstrates the virtual Sand Table Dec. 6, a project that Class of 2018 Cadet Jake Marchillo has been working on for three months in support of the Army Research Lab in Orlando, Fla. The Sand Table is used to teach terrain features.

Class of 2018 Cadet Jacob Marchillo, a geospatial information science honors major, constructed an Augmented Reality Sand Table (ARES) as part of his capstone project. Marchillo is working closely with Capt. William "Gabe" Powell, independent study advisor and course director of Remote Sensing (EV377) and Advanced Remote Sensing (EV477) in the Department of Geography and Environmental Engineering, to build a sand table as a teaching tool for students and cadet development.
Marchillo is constructing and applying an augmented reality system for the display of geospatial data, specifically the elevation data and aerial imagery of Pointe du Hoc, the 100-foot cliff overlooking the English Channel on the coast of Normandy in France.
ARES will be used to visualize the physical terrain of Pointe du Hoc to better understand the terrain’s impact on that portion of the D-Day invasion. The work supports a larger effort to create an augmented reality Pointe du Hoc experience for ‘The History of Military Art.’
The greater project is a collaborative effort of cadets from the History; Geography and Environmental Engineering; Systems Engineering; and Electrical Engineering and Computer Science departments.
Marchillo became interested in the project through his Advanced Individual Academic Development (AIAD) experience at the Simulation and Training Technology Center, a division of the Army Research Laboratory’s Human Research and Engineering Directorate in Orlando, Florida.
“It was a very cool experience,” Marchillo said. “I was able to tour the whole facility and learn about all their different projects. My favorite part was the augmented reality training, but I focused mostly on the sand table and other projects. I learned the basic functionality of the sand table and the new software moving forward. The sand table will soon enter the (completely digital realm) and get rid of the sand.”
Marchillo said the Army Research Laboratory provided a step-by-step manual for creating the sand table.
“I had all of the hardware by lesson one,” Marchillo explained. “I started assembly immediately and we hit a big bump with installing the software due to security concerns with the U.S. Military Academy network. Capt. Powell took control at that time, communicating directly with ARL to get the solution. I just didn’t have the time to coordinate and get it done. But he was tough and always stayed on top of me to get my tasks done right and on time. He has been a great mentor.”
Marchillo said setting up the software was the hardest part, taking about 20 man-hours over two and a half weeks.
“Getting the Kinect sensor and projector to work was hard,” Marchillo said. “The projector doesn’t sense anything, it just projects the image onto the sand, but lining it up and calibrating it was difficult. It took help from Capt. Powell.”
Powell explained that ARES invigorates the learning environment through an interactive, sensory experience, enabling visualization of people, places and the environment through the three-dimensional display of geospatial information.
It excels at depicting the physical landscape’s effect on the cultural landscape and vice versa. Powell demonstrated how the sand table works using a time lapse (1910 to present) of West Point maps and imagery showing the changing topography, and explained the different components of creating a sand table.
“The sand table uses highly reflective low-silica sand, which provides tactile interaction for students and gives the cadets the ability to physically interact with the sand as they are studying a specific phenomenon,” Powell said. “We used an Xbox Kinect sensor that had been modified to measure the distance from the sand. The Kinect alone provides several teaching points for our Geospatial Information Science program; measuring the distance to the sand has many of the same issues as imaging the earth from a satellite. The Kinect sensor contains a monochrome CMOS, or complementary metal-oxide semiconductor, depth sensor and an infrared projector that measures the height of the sand by transmitting invisible near-infrared light and capturing its ‘time of flight’ after it reflects off the sand.”
The Kinect sensor scans the surface of the sand and detects user manipulation of it. The ARES software then creates a map of the sand topography and provides feedback to the student. This allows ARES to respond dynamically to the changing shape of the sand as a user interacts with it, such as when building up a mountain or hill, with the change reflected on both the table and the computer screen.
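
How a system like this might turn raw depth readings into a projected elevation map can be illustrated with a short sketch. The Python below is not the actual ARES software; the depth frame, table-distance constant and color bands are hypothetical stand-ins. It simply converts sensor-to-sand distances into sand heights and then into a color-coded overlay a projector could cast back onto the sand.

import numpy as np

# Minimal illustrative sketch (not the ARES code). Assumes a per-pixel depth
# frame, in millimeters, from a Kinect-style sensor looking straight down.

def depth_to_elevation(depth_mm, table_distance_mm=1200.0):
    # Sand height above the empty table = sensor-to-table distance minus
    # sensor-to-sand distance; negative values (noise) are clipped to zero.
    return np.clip(table_distance_mm - depth_mm, 0.0, None)

def colorize(elevation_mm):
    # Map elevation bands to colors the projector can display on the sand.
    bands = [(0, (0, 0, 255)),        # low ground: blue
             (40, (0, 200, 0)),       # middle elevations: green
             (80, (150, 75, 0)),      # higher ground: brown
             (120, (255, 255, 255))]  # peaks: white
    h, w = elevation_mm.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    for threshold, color in bands:
        rgb[elevation_mm >= threshold] = color
    return rgb

# Synthetic depth frame standing in for live sensor data: a flat table with
# a hand-built hill about 100 mm tall.
depth_frame = np.full((480, 640), 1200.0)
depth_frame[200:280, 300:380] = 1100.0
overlay = colorize(depth_to_elevation(depth_frame))

In a real setup the overlay would also have to be warped into the projector's coordinate frame, which is the alignment and calibration step Marchillo describes above.
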
Powell said that in the spring, ARES will be utilized across all three GENE programs (Geospatial Information Science, Geography and Environmental): to instruct cadets in EV203 (Physical Geography) on terrain features as part of the geomorphology block of instruction; to instruct cadets in EV388a (Advanced Physical Geography) on 3-D geologic structure in relation to the physical terrain; and to capitalize on the Kinect infrared sensor to instruct cadets in EV377 on the effects of collection geometries on satellite and aerially acquired images.