The Tool as an Extension of the Hand: Part II in the Autodidact Series

Teleoperation is the act of controlling an object that exists in a space, real or virtual, physically disconnected from the user. During such situations, it is not uncommon to observe those controlling the remote object exhibiting movement consistent with the behavior of the remote object. Though this behavior has no obvious impact on one’s control of the remote object, it appears tied to one’s intentions, thus, possibly representing an embodied representation of ongoing cognitive processes…The implications of these results not only suggest that spontaneous behavior observed during teleoperation reflects a form of visible embodiment, sensitive to task demand, but also further emphasizes the utility of natural behavior approaches for furthering our understanding of the relationship between the body and cognitive processes. (Chisholm, 609)

Peripersonal space refers to the space closely surrounding the parts of our bodies.  In the case of the hand, this is most closely approximated by the natural reach envelope.  Studies have shown that in humans and non-human primates alike, neurological responses to objects within the peripersonal space of the hand differ from responses to objects further out of reach. (Farnè, 408)  Researchers have delved into this phenomenon, observing monkeys with and without tools.  In the simplest distillation, the subject understands the limitation of its reach (its peripersonal space) when confronted with a food pellet beyond it.  With limited training, the subject learned to use a rake when presented with the tool, effectively extending its reach and attaining the food.  Analysis indicates that when the monkey wields the rake, its spatial awareness is extended by the length of the tool.  In short, its peripersonal space becomes that of the monkey plus the tool. (Maravita, 80)

Notably, the use of simple reach-extending tools enlarges the visual receptive fields in the monkey’s brain along the rake axis, remapping distant objects as visually nearer.  This remapping, which explains the change in spatial awareness, occurs only during active use of the tool.  In fact, passive handling of the tool shrinks the receptive field back to its pre-tool-use size.  Cross-modal extinction, in which a visual stimulus near the far end of the tool interferes with tactile perception at the hand, presented as greater after tool use than before, confirming the elongation of peri-hand space.  (Farnè, 409)  Even more remarkably, while training to use the rake to gather out-of-reach food pellets took weeks, once the basic skill was learned, the monkeys were able to execute a second-order task: using a shorter rake to grasp a longer rake and retrieve pellets placed out of reach of the shorter tool, but within the grasp of the longer one.  (Maravita, 84)

The lengthening of peripersonal space offers a glimpse into first-order mediated action.  Much like a tennis player striking a ball, this type of action involves direct contact with the item, direct control of a proximal artifact with the body.  Second-order action may demonstrate a more significant perceptive change.  Distal objects, in this case, are acted upon by means of direct manipulation of a proximal one, as when a crane operator works a lever, lifting materials by means of a cable and boom never directly touched.  “Here we suggest that these two mediated actions have different effects on our experience: a successfully learned first-order mediated action produces incorporation—the proximal tool extends the peripersonal space of the subject (the subject is present in the tool)—while a successfully learned second-order mediated action produces also incarnation—a second peripersonal space centered on the distal tool.”  Further work with monkeys demonstrates that, through the use of a Microsoft Kinect and a monitor, the animals were able to use a self-image as the distal reference and coordinate their movements to retrieve the food.  Identical neural processes applied to the coded image of the hand in the distal retrieval as in the proximal; the image of the hand is experienced as a direct extension of the self.  (Riva, 206-7)

The obvious extension of research into second-order actions lies in the teleoperation of objects in virtual space.  The concept remains the same: the subject uses proximal tactile interaction to realize distal goals.  In first-order interaction, extension of peripersonal space manifests as incorporation.  In second-order interaction, this incorporation works in conjunction with remote actions, adding an extension of extrapersonal space and thus leading to incarnation.  Such incorporation leads to body-based actions related to the remote goal, acting primarily on neither the proximal nor the distal artifact.  Behavior such as this belies the notion that identical cerebral functions are employed in both cases.  (Chisholm, 602)

To test such theories, Chisholm et al. employed a popular off-road video game in a controlled experiment.  Subjects were given either a “regular” or a “reversed” controller: in the regular scenario, the on-screen vehicle responded to right directional input by moving to the right; in the reversed scenario, it responded to right directional input by moving to the left.  Subjects received ample practice runs to acclimate to the randomly assigned control pattern.  A set number of time-trial runs were then filmed.

Computer analysis of the footage revealed little statistical difference in subjects’ ability to operate the vehicle with either control scheme.  Further, the researchers observed a clear relationship between the subjects’ body movements and the intended goal on screen: when the track turned to the right, necessitating that the vehicle turn right, the participants invariably leaned to the right.  Surprisingly, the control scheme had no statistical effect on the synchronicity of the body movements and the target movement.  Even with the reversed controls, while steering left, the subject leaned to the right as the track veered right, indicating a remote, goal-driven movement.  “This finding provides support for the notion that both action plans are generated separately…and is consistent with the idea that the movement reflects an overt manifestation of second-order mediated action representations rather than first-order mediated action representations, respectively.”  (Chisholm, 615)

One cannot deny the implications of these inquiries into tool use, spatial awareness, and telepresence for the realms of Virtual Reality and Augmented Reality.  But we live in Real Reality.  First and foremost, BIM is a construct, an idea, a paradigmatic approach to delivery in the AEC industry.  Autodesk® Revit® is a tool to perform BIM and execute an idea from concept to built form.  Furthering the conceit that Revit is the tool, how can it be used as an extension of the hand in the context of an architectural practice?  To quote the vernacular, “guns don’t kill people, people kill people”: Revit does not design anything, people using Revit do.  Approaching the acquisition of software from an autodidact’s standpoint (see my previous article, February 2017), these explorations into presence and spatial perception inform the dialogue.  Presence is defined as the outcome of an embodied, intuitive simulation of the intended action, developed through practice (implicit learning).  (Riva, 203)

Learning the software itself becomes the first-order action.  Of course, we all incorporate the software into our peripersonal space.  By learning the mechanics of the buttons, menus, and elements therein, we associate them with the abilities they afford us to create a virtual (model) representation of our creations: annotations, schedules, dimensions, families, and sub-elements.  If Revit is the tool, then, as in the short-and-long-rake exercise, the direct user must utilize (and master) the individual tools built into the program to better employ the entirety as a tool.  The first-order learning model is simple and straightforward.  However, managing a team, acquiring new software, and maintaining momentum on a complex project with many moving parts transcends first-order learning and necessitates a more global approach.

I will posit here that telepresence can exist in a physical space as much as it can in virtual and augmented space.  I will further propose that an architectural team essentially embodies a second-order activity wherein the team leader directs the actions of the team members, thus affecting the outcome of a distal objective.  The junior team members become the interface between the controlling element and the objective.  This puts the onus of understanding the team’s abilities, strengths, and weaknesses on the leadership.

Intuitive though it may be, in the construct of a second-order relationship, a junior member’s ability to use the tool is analogous to the length of the rake in the monkey studies.  The perception of presence, in that case, becomes akin to the understanding of the tool’s length.  In our practice, every member holds that perception at a different level: for the junior, ability to use the software is the metric; for the leadership, the best use of each team member is the crucial understanding.  In the skilled deployment of each tool within the construct, greatness can result.  The incorporation of the various talents together leads to the incarnation of a stellar team.

Sources Cited:

Alessandro Farnè, Silvia Bonifazi & Elisabetta Làdavas (2005) The Role Played by Tool-Use and Tool-Length on the Plastic Elongation of Peri-Hand Space: A Single Case Study, Cognitive Neuropsychology, 22:3-4, 408-418, DOI: 10.1080/02643290442000112.

Angelo Maravita and Atsushi Iriki (2004) Tools for the Body (Schema), Trends in Cognitive Sciences, Vol. 8, No. 2, 79-86.

Giuseppe Riva and Fabrizia Mantovani (2012) From the Body to the Tools and Back: A General Framework for Presence in Mediated Interactions, Interacting with Computers, 24, 203-210.

Joseph D. Chisholm, Evan F. Risko & Alan Kingstone (2014) From Gestures to Gaming: Visible Embodiment of Remote Actions, The Quarterly Journal of Experimental Psychology, 67:3, 609-624, DOI: 10.1080/17470218.2013.823454.

Andy Fastman, AIA, LEED AP BD+C, has worked in various facets of the architectural world for nearly 20 years.  Educated at the Georgia Institute of Technology, the École d’Architecture de Paris-La Villette (formerly Beaux Arts), and the University of California, Los Angeles, he maintains one foot staunchly in the academic realm.  Andy has worked for a variety of firms in the Los Angeles area, including Gehry Partners, Jerde Partnership, and Ball-Nogues Studio.  He recently returned to Cuningham Group Architecture’s Big Play Group, which focuses on big-name entertainment and theme park designs.  He also teaches at Otis College of Art and Design and is affiliated with the New York City College of Technology’s Advanced Design online studio critic project.  With only 14 months of Revit experience under his belt, Andy continues to learn (while teaching) and extend his reach on a daily basis.