Redesigning Education

Meet your new co-worker: A wood-crafting robot

Though humans and robots work in very different ways, there are striking similarities in how they learn new crafts, and this opens significant opportunities for using robots in design practice.

When it comes to craft, humans and robots are clearly very different machines. A human, for example a sculptor working with wood, changes position frequently, moving silently around the piece they are carving. The only noise they make is the soft, repeated slicing of the chisel into the wood. The robot, designed for heavy manufacturing, is usually stationary. It is capable of exerting crushing force and emits a low background whirr as it extends its “arm” and manipulates the chisel at its tip.

The physical differences may be extreme, but both learn the craft in similar ways. Humans refine their craft through years of practice, developing an intuitive “feel” for the wood. Knowing how to adjust the angle and pressure of the chisel to suit the density of the material or the orientation of the grain is something that cannot be described explicitly. A human apprentice first observes the master, then develops a similar intuition, but one tied to a different body and possibly different tools.

[Image: hands working with input tools, mirrored by a chisel on a laptop]

Similarly, the robot must learn for itself. The first stage of machine learning is to track the master craftsperson’s movements, recognising patterns in them and producing a reasonable copy of the chisel’s path, despite varied species of wood and a very different robotic arm. In the second stage, the robot becomes independent and refines its skill by trial and error. It makes many cuts, experimenting to see which are optimal and which splinter or jam. From these it develops its own knowledge: a similar intuition and “feel” for the process.

Our research used real-world fabrication data, collected from human experts and autonomous robotic sessions, to derive a more accurate geometrical prediction of carving operations on timber. This consisted of a series of training procedures for a robotic fabrication system, in which the instrumental and material knowledge of skilled human craftspeople was captured, transferred, robotically augmented and finally integrated into an interface that makes this knowledge available to designers.

[Image: a robot chisel pulls away a spiral of wood]

In a series of carving sessions, a dataset was collected using an array of motion-capture cameras: tool-to-surface angle, tool-to-grain-direction angle, force feedback, input cut length and input cut depth. These inputs were paired with their measured outcomes, such as the actual length and depth of each cut, which were gathered using 3D photogrammetry to reconstruct the results of the carving operations as highly detailed mesh geometry. The collected datasets were used to train an artificial neural network, or ANN, which predicts the geometric outcome of the subtractive operation from a user-defined toolpath and the fabrication parameters described above, and can also generate a robotic toolpath from a digitally carved geometry.
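The shape of such a training setup can be sketched in a few lines. This is a minimal illustration, not the project’s actual model: it uses synthetic stand-in data (the real study used motion-capture and photogrammetry measurements) and a tiny hand-written network, with the same input/output layout the text describes: five fabrication parameters in, measured cut geometry out.

```python
import numpy as np

# Hypothetical feature layout, mirroring the parameters in the text:
# [tool-to-surface angle, tool-to-grain angle, force, input length, input depth]
# Targets: measured [actual length, actual depth] of the cut.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 5))           # normalised inputs
W_true = rng.normal(size=(5, 2))
y = X @ W_true + 0.01 * rng.normal(size=(500, 2))  # synthetic stand-in data

# One hidden layer with tanh activation, trained by plain gradient descent.
W1 = rng.normal(scale=0.5, size=(5, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 2)); b2 = np.zeros(2)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # gradient of MSE (up to a constant)
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def predict_cut(params):
    """Predict (length, depth) of a cut from fabrication parameters."""
    h = np.tanh(params @ W1 + b1)
    return h @ W2 + b2

mse = float(np.mean((predict_cut(X) - y) ** 2))
```

In practice a deeper network and a proper train/validation split would be used; the point here is only the mapping from fabrication parameters to measured cut geometry.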

[Image: machine sensors]

“The training workflow should not be considered a linear progression from the recording to the fabrication stage but rather as a knowledge platform.”

Each robotic toolpath is a sequence of target frames, which define the position and orientation of the carving gouge along the cut. Given a sequence of target frames, the trained ANN predicts, at each frame, the geometric output parameters of the cut (length, width, depth), considering the influence of material properties determined by the wood species (i.e., grain arrangement and density).
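The frame-by-frame evaluation described above can be sketched as follows. The names (`TargetFrame`, `evaluate_toolpath`) and the stand-in predictor are illustrative assumptions, not the project’s API: any trained model with the same signature could be slotted in place of `dummy_predict`.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class TargetFrame:
    """One target frame along the toolpath: gouge position and orientation."""
    position: Tuple[float, float, float]      # x, y, z in the workpiece frame
    orientation: Tuple[float, float, float]   # e.g. Euler angles of the gouge

def evaluate_toolpath(frames: List[TargetFrame],
                      grain_angle: float,
                      density: float,
                      predict: Callable[..., Tuple[float, float, float]]):
    """Query the trained model once per frame: each prediction is the
    (length, width, depth) of the cut at that frame, conditioned on the
    wood-species properties (grain arrangement and density)."""
    return [predict(f.position, f.orientation, grain_angle, density)
            for f in frames]

# Stand-in for the trained ANN; the inverse-density rule is purely illustrative.
def dummy_predict(pos, ori, grain_angle, density):
    depth = 2.0 / density    # denser wood -> shallower cut
    return (5.0, 3.0, depth)

path = [TargetFrame((i * 1.0, 0.0, 0.0), (0.0, 30.0, 0.0)) for i in range(4)]
cuts = evaluate_toolpath(path, grain_angle=15.0, density=0.6,
                         predict=dummy_predict)
```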

The trained ANN is a body of knowledge that can be transferred, re-used, extended and, most importantly, integrated within an interface to evaluate multiple design solutions digitally, informed by tool and material properties, before moving to the production stage. The training workflow should not be considered a linear progression from the recording to the fabrication stage but rather as a knowledge platform that can be remodeled over several cycles with new fabrication data, trained to improve its prediction performance and applied to various design tasks.
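The cyclical “knowledge platform” idea, accumulating data over repeated recording sessions and refitting the model each cycle, can be sketched like this. The class and its methods are hypothetical, and ordinary least squares stands in for the ANN purely for brevity.

```python
import numpy as np

class CarvingKnowledge:
    """Accumulates fabrication data across recording sessions and refits a
    model each cycle -- a sketch of the knowledge-platform idea, with a
    least-squares fit standing in for the ANN."""
    def __init__(self):
        self.X = np.empty((0, 5))   # fabrication parameters
        self.y = np.empty((0, 2))   # measured cut geometry
        self.W = None

    def add_session(self, X_new, y_new):
        """Append one recording session's data to the platform."""
        self.X = np.vstack([self.X, X_new])
        self.y = np.vstack([self.y, y_new])

    def retrain(self):
        """Refit on everything collected so far."""
        self.W, *_ = np.linalg.lstsq(self.X, self.y, rcond=None)

    def predict(self, X):
        return X @ self.W

rng = np.random.default_rng(1)
kb = CarvingKnowledge()
for session in range(3):                  # three recording cycles
    Xs = rng.uniform(size=(100, 5))
    ys = Xs @ np.ones((5, 2)) + 0.01 * rng.normal(size=(100, 2))
    kb.add_session(Xs, ys)
    kb.retrain()                          # the model is remodeled each cycle
```

Each cycle grows the dataset and refits, which is the sense in which the platform is remodeled rather than trained once.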

[Image: a step-like pattern chiselled into wood]

Previous research has explored applying machine learning models to optimise robotic tasks in an industrial context. The novelty of this research lies in applying similar established methods within the workflow of creative practices to augment and support the abilities of designers. During collaborations with ROK Architects and the Danish practice BIG, the technique was introduced into established workflows, and through this a catalogue of design explorations was developed for a wide range of applications, from furniture to building components for larger assemblies.

[Image: eye-like wooden shapes]

Developing a methodology that brings robotic craft into the design environment shows that engaging with fabrication tools and material affordances early in the process leads to better-informed design decisions. For designers, access to packages of instrumental knowledge extends manufacturing techniques, as the trained networks significantly increase accuracy in predicting and simulating non-standard processes. Designers willing to engage with the process can create custom manufacturing workflows, validated by feedback data and statistical models.

For companies, the research demonstrates the advantages of packaging knowledge, making it available to all the stakeholders involved in the design-to-manufacturing workflow, to ensure fruitful communication from the outset and to help avoid costly mistakes at a later stage.

Giulio Brugnaro

Research Assistant, The Bartlett School of Architecture

Prof Sean Hanna

Professor of Design Computing, The Bartlett School of Architecture

Prof Bob Sheil

Director and Professor of Architecture and Design through Production, The Bartlett School of Architecture
