
Designing with a Robot

Interactive methods for brick wall design using computer vision

Avishek Das, Isak Worre Foged, Mads Brath Jensen

Department of Architecture, Design and Media Technology, Aalborg University, Denmark
{adas|iwfo|mbje}@[Link]

The deterministic and linear nature of robotic processes in architectural construction often allows no or very little adjustment during the fabrication process. If any need for modification arises, the process is usually interrupted, changes are accommodated, and the process is resumed or restarted. The rigidity of this fabrication process leaves little room for creative intervention, and human activities and robotic processes are often considered as two segregated processes. This paper will present and discuss the methodological and design challenges of interactive robotic fabrication of brickwork with an industrial robotic arm, a webcam and bricks of varying colour tones. Emphasis will be on the integration of external computer vision libraries within Rhino Grasshopper to augment the interactive robotic process. The paper will describe and demonstrate a framework comprising (1) robotic pick and place with material selection and evaluation using computer vision, (2) interactive robotic actuation and (3) the role of human input during a probabilistic fabrication-based design process.

Keywords: interactive robotic fabrication, human robot collaboration, computer vision, masonry, machine learning

INTRODUCTION
The deterministic nature of robotic fabrication processes usually leaves less room for adjustments compared to manual crafting. If any minor changes or adjustments are needed, the process is often interrupted and, after the accommodation of the changes, the process is restarted. This rigidity in the workflow seldom leaves any room for creative adjustments in the fabrication. Design and fabrication often become two different and disjoint processes.

This paper investigates a computer vision-based design and fabrication setup using a collaborative robotic arm. Through a visual analysis and machine learning based framework, this paper intends to contribute to human robot collaboration in construction. The work presented in the paper constitutes a set of methods in a sequence to enrich the interaction between mason and robot while advocating "thinking through making" (Ingold, 2013). This work, while performed in a controlled indoor environment, is a foundation for on-site robotic collaboration and for exploring material attributes with robots.



Former applications of computer vision to masonry and stacked structures can be categorized into two groups: 2D image-based visual analysis and stacking (Dörfler, Rist and Rust, 2013) and 3D point cloud based approaches (Elashry and Glynn, 2014). In the first case, computer vision was used for picking up objects by extracting positions from the image feed. In the second case, computer vision was used to fine-tune robotic brickwork by removing excess mortar while mimicking human behaviour. For human robot interaction, there has been a significant number of works in past years, but particularly in masonry the AR-VR based collaborative models (Johns, Anderson and Kilian, 2020) have been useful to classify the modes of collaboration. Applications and implications of human robot collaboration using a novel framework have been discussed by Dubor et al. (Dubor et al., 2016) and Moorman et al. (Moorman et al., 2016), emphasizing the importance of sensing and feedback in a deterministic task framework.

Potential solutions for the issue might be relatively new to architecture and construction but have been used extensively in industrial robotics. Computer vision to detect and pick up objects has been commercialized by many companies, and marker based detection has also been used extensively as a means of localization in recent times (Handle | Boston Dynamics, 2020; Pick | Boston Dynamics, 2020). Machine learning and colour detection have also been employed as part of frameworks to facilitate interactive robotics. The first part of the project uses visual analysis and colour detection with a curated dataset from bricks of 3 major colour categories; the second part uses the data from the previous stage to control the robot; and the third part integrates the human input into the robotic workflow, breaking the deterministic nature of the workflow.

This paper will present a computational framework (Fig. 1) for on-site robotic collaboration based on computer vision, machine learning and robotic control. Firstly, it will present computer vision methodologies to determine the pick and place position of the bricks; secondly, a machine learning model will be explained to categorize the colour of the bricks. Further, it will present a robotic design method based on the previous methods to imitate human-like aesthetics in a brick wall.

Figure 1: Framework of the system of the experiment

METHODS & SETUP
The project uses a combination of computer vision, machine learning, robot control and computational design to augment a physical design setup to experiment with collaborative robotics. (1) Computer vision (CV) and machine learning (ML) are used to delineate the bricks in the physical space and identify their colour for the robot control and computational design system to use. (2) The computational design receives the colour data and brick number from the previous step to perform a user defined distribution over the brick wall. (3) Combined with the CV and the design of the masonry, the robot control hands the brick either to the mason or places it directly in the bond pattern.

To facilitate a collaborative design environment using different coloured bricks as a medium, a physical and computational design system was created. This helped the design to be realised by a mason and a robot with the assistance of a vision and localization aid.

Physical Design System
The collaborative design environment was developed around a UR10 robot fitted with an SMC pneumatic gripper and 2 push buttons.



The robot is placed at 320 mm from a table surface of 2100 x 1220 mm. In order to explicitly define the workspace, the corners of the table have been marked with ArUco markers. These square fiducial markers (Garrido-Jurado et al., 2014) are used for precise translation (or transformation) of the physical design space into a digital design space with an appropriate scaling reference. This physical setup (Fig. 2) is aided by an overhead Logitech Brio 4K webcam which conducts the visual analysis of the system. Due to this physical configuration, the visual analysis is limited to the XY plane of the setup, mostly in two dimensions.

While facing the setup, approximately 700 mm from the left is dedicated to brick placing and pick up, and the rest of the table space within the boundary has been used for design and collaboration. This approximate dimension helps the collaborator to maintain a boundary between the design space and the material setup. For the construction process, bricks of DNF (Danish Normal Format) are used, having dimensions of 228 x 108 x 54 mm. The brick stock has an uneven distribution of red, grey and yellow colours. It is necessary to mention that, due to a skylight in the test environment, the light condition varied between warm and cool tones based on the season.

Figure 2: Experimental setup involving robot, end effector, vision system and working table

Computational Design System
The experiment used a combination of the Rhino-Grasshopper and Python environments to describe the computational design system.


This computational system is divided into three subcategories, viz. computer vision and visual analysis, geometric computation, and robotic manipulation with a state machine.

Computer Vision and Machine Learning. OpenCV (OpenCV, 2020) with its Python binding has been used as the computer vision tool for this project due to its wide range of applications, ease of access and large community support. It helps translate the physical design space into a digital twin. It uses the ArUco markers to determine the boundary of the work surface and maps the physical extent of the table to a digital instance using perspective and distortion correction. Once it recognizes the white table surface, it subtracts the background to analyse the content of the table, viz. bricks, brick colour and position. On recognition of the bricks and their positions, a small machine learning model made using Tensorflow-Keras (Keras | TensorFlow Core, 2020) is invoked to recognize the colour of each brick and classify it into the red, grey or yellow category. This part of the design setup is stored as separate Python script files and invoked from Grasshopper as a sub-process to run externally, to save time and memory. Each of these pieces of code is invoked by a state number and the output is sent using the UDP protocol to a unique port in Grasshopper.
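To illustrate the workspace mapping described above, the following is a minimal sketch of how corner ArUco markers can be detected with OpenCV (legacy cv2.aruco API) and used to warp the camera view into a scaled, rectified top view of the table. The marker IDs, marker dictionary and output resolution are assumptions for illustration, not values given in the paper.

```python
import cv2
import numpy as np

# Assumed values for illustration only: marker IDs 0-3 placed clockwise at the
# table corners, and the 2100 x 1220 mm table rectified at 0.5 mm per pixel.
CORNER_IDS = [0, 1, 2, 3]
TABLE_W_MM, TABLE_H_MM = 2100, 1220
MM_PER_PX = 0.5

def rectify_table(frame):
    """Detect the four corner ArUco markers and return a scaled top view."""
    aruco = cv2.aruco
    dictionary = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)
    corners, ids, _ = aruco.detectMarkers(frame, dictionary)
    if ids is None or not set(CORNER_IDS).issubset(set(ids.flatten())):
        return None  # not all corner markers are visible

    # Use the centre of each marker as the corresponding table corner.
    centres = {int(i): c[0].mean(axis=0) for i, c in zip(ids.flatten(), corners)}
    src = np.float32([centres[i] for i in CORNER_IDS])

    w_px = int(TABLE_W_MM / MM_PER_PX)
    h_px = int(TABLE_H_MM / MM_PER_PX)
    dst = np.float32([[0, 0], [w_px, 0], [w_px, h_px], [0, h_px]])

    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, homography, (w_px, h_px))
```

In the setup described by the authors, the rectified image would then be background-subtracted to locate the bricks; only the marker-based perspective correction step is sketched here.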
Geometric Computation and Robot Control. In the Grasshopper environment, on the design space, a parametric brick wall is generated using native components. The design of the wall has been kept simple, as the main goal was to experiment with the distribution of the colours of the bricks using a human-like sensibility. Grasshopper lets the user flow data from left to right without an option to feed the generated data back into a previous process. To overcome this issue, a state machine (Finite State Machines, 2018) is also developed using the Grasshopper add-on Metahopper (MetaHopper, 2020) to control the flow and sequence of the processes. The robot control is designed based on Vicente Soler's Robots framework to control the UR10 for pick and place of the bricks. Some parts of the code also use the Python package URX (URX, 2020) to move and control the robot for ease of use in the visual analysis. The brick designing algorithm identifies the colour of the brick based on the colours of its nearest neighbours and the colours available in the brick stock (Fig. 3). The algorithm uses a heuristic solution for the colour assignment rather than a stochastic solution, keeping in memory the colours available at a particular time. This implies that the solution might not be the most optimal one but the best one, given the availability of the bricks.

Figure 3: Robotic workplace setup and calibration. View from top mounted camera

Development of a collaborative system implies a certain amount of sensing on the machine side in order to collaborate with a human. Implementing this process through 2D image-based computer vision was a relatively easy task compared to other sensing methods, including 3D scanning and point cloud generation. These methods are much more resource intensive and in most cases require a special API or SDK to compute. Image-based CV helps establish the sensing using fewer resources and in less time. Due to the lighting situation in the test environment, the perceived colour of the bricks differs over time in various seasons. This implies that it is not possible to identify the colours of the bricks using an absolute model. Implementation of a simple machine learning model helped overcome this issue. For the experiment, four varying shades of yellow bricks, fifteen varying shades of grey bricks and 13 varying shades of red/orange bricks were procured from the material library to create a colour recognition dataset. These were photographed using the same webcam in various light conditions. The labelled colour data is fed into a simple supervised prediction model. Using this model, the colour of the bricks is determined after the background subtraction. This increases the accuracy of the detection and works in light conditions varying from warm to cool.
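As an illustration of the kind of small supervised model described above, the sketch below trains a tiny Keras classifier that maps a brick's mean colour (sampled from a detected brick region after background subtraction) to one of the three categories. The feature choice, layer sizes and training settings are assumptions; the paper does not specify the network architecture.

```python
import numpy as np
import tensorflow as tf

CLASSES = ["red", "grey", "yellow"]

def build_colour_classifier():
    """A small dense network: mean BGR colour (3 values, scaled 0-1) -> class."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(3,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def train(model, x_train, y_train):
    # x_train: (n, 3) mean colours of labelled brick crops photographed under
    # varying light; y_train: (n,) integer labels indexing CLASSES.
    model.fit(x_train, y_train, epochs=50, batch_size=16, validation_split=0.2)

def predict_colour(model, mean_bgr):
    probs = model.predict(np.asarray([mean_bgr], dtype=np.float32) / 255.0)
    return CLASSES[int(np.argmax(probs[0]))]
```

Training on crops taken under both warm and cool light is what would give such a model the tolerance to seasonal lighting changes that the authors describe.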



The goal of this experiment is to collaborate with a robot to imitate human sensibility in a particular task. For this reason the design of the wall has been kept simple, but not generic, to explore the possibilities of robotic collaboration through actuation.

Figure 4: Example of colour variation in a masonry work. Photo by authors.

The design of the masonry is developed incrementally based on the number of bricks on the table and their colours. The algorithm uses a simple neighbouring rule set to create a colour distribution and variation. But it also improvises the assignment of the colour based on the colours of the bricks available. This can be overridden by the collaborator, and the input will be passed to the designed algorithm to predict the colour. This gives equal leverage to the human creativity and to the designed algorithmic intelligence of the system.
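The neighbouring rule set is not spelled out in the paper; the sketch below is a hypothetical greedy interpretation of it. For the next brick position, each colour still in stock is scored by how rarely it appears among the already-placed neighbouring bricks (the figure later in the paper suggests the six nearest neighbours), and the least represented one is picked.

```python
from collections import Counter

COLOURS = ["red", "grey", "yellow"]

def choose_colour(neighbour_colours, stock):
    """Pick a colour for the next brick.

    neighbour_colours: colours of the (up to six) nearest placed bricks.
    stock: dict colour -> number of bricks of that colour left on the table.
    Returns the available colour least represented among the neighbours,
    avoiding repetition while respecting what is actually in stock."""
    counts = Counter(neighbour_colours)
    available = [c for c in COLOURS if stock.get(c, 0) > 0]
    if not available:
        raise ValueError("no bricks left in stock")
    # Least frequent among neighbours wins; ties broken by larger remaining stock.
    return min(available, key=lambda c: (counts[c], -stock[c]))

# Example: two red and three grey neighbours, yellow still in stock -> "yellow".
print(choose_colour(["red", "grey", "red", "grey", "grey"],
                    {"red": 4, "grey": 1, "yellow": 6}))
```

This mirrors the paper's point that the choice is the best given current availability rather than a global optimum: the decision is local and greedy, and a manual override by the mason simply replaces the returned colour before the stock is updated.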
The robot acts as an active collaborator and mostly serves as a helping "arm" for the human. The algorithm beneath the robotic manipulation and pattern generation provides the placement planes in real time, and the robot actuation occurs through those planes.

DESIGN EXPERIMENTATION
Robotic masonry has been under development since the last decade, but the majority of the processes are either fully automated or leave very little room for human intervention, mostly at an operator level (Piškorec et al., 2019). The only intelligence used in the design process is the algorithmic intelligence. Specifically, in this project, it has been studied how the human creative agency can be augmented with the algorithmic intelligence and computational design exploration (Fig. 4). The majority of the setup is based on an interaction basis where the user interacts with the system using two push buttons to assert the next step of the process. To create such behaviour within Grasshopper, a state machine setup is used where each state is invoked on fulfilling a certain condition, completes a particular task and triggers another state based on either the output of that particular state or a human override. This helps to control the interaction process from one single environment. These state numbers are also the same input that externally triggers the Python files in order to complete tasks based on visual communication.
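As a sketch of how such button-driven states and the UDP hand-off to Grasshopper could look on the Python side, the snippet below models a few states and sends the current state number and result to a local UDP port. The state names, port number and message format are assumptions, since the paper only states that state numbers trigger the external scripts and that results return over UDP.

```python
import json
import socket

# Assumed port and states for illustration; the paper does not list them.
GRASSHOPPER_UDP_PORT = 6006
STATES = ["WAIT_FOR_BUTTON", "SCAN_TABLE", "CLASSIFY_COLOURS", "PLACE_BRICK"]

def send_to_grasshopper(state, payload):
    """Send the current state number and its result to Grasshopper over UDP."""
    message = json.dumps({"state": STATES.index(state), "data": payload})
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), ("127.0.0.1", GRASSHOPPER_UDP_PORT))

def step(state, button_pressed, result=None):
    """Minimal transition table: each state completes its task, reports it,
    and returns the next state; a button press advances from the idle state."""
    if state == "WAIT_FOR_BUTTON":
        return "SCAN_TABLE" if button_pressed else "WAIT_FOR_BUTTON"
    if state == "SCAN_TABLE":
        send_to_grasshopper(state, result or [])
        return "CLASSIFY_COLOURS"
    if state == "CLASSIFY_COLOURS":
        send_to_grasshopper(state, result or [])
        return "PLACE_BRICK"
    if state == "PLACE_BRICK":
        return "WAIT_FOR_BUTTON"
    raise ValueError(f"unknown state: {state}")
```

On the Grasshopper side, a listener on the same port would read these messages and update the corresponding state, as described above.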
The collaboration process does not start until a certain predefined number of bricks of varying colours has been placed by the human (Fig. 5). The colours of these bricks act as the design driver (the seed of the pattern) for the further set of bricks, based on the colours of the bricks available on the table. This step is used as an augmentation to the creative agency of the human, and in reciprocation the behaviour is observed in due course. The experiment is designed to generate a simple non-repeating pattern based on the colour variation of the bricks. To achieve this goal, the human and the robot work together and both agencies' decisions are influenced by each other. During the experiment the following points have been studied:


• The material setup is done by the human collaborator. Hence, the probability of randomness in the colour distribution of the bricks could vary at each step. The algorithm's ability to generate the randomness of the pattern was the first thing that was studied. Masons, through their experience, can vary the colour of the bricks in a bond to avoid repetition. It was observed how the system behaves through this change.
• Following the system generated results, it was observed how the mason responded to the designed pattern. It was also observed how much of the algorithmic pattern was accepted and retained by the mason and what kind of influence it left on the creative persona of the mason.
• Finally, by varying the robot speed, it was also studied at what speed and acceleration the mason was most comfortable collaborating with the robot.

RESULTS
The experiment yielded insights into the process. These results can be categorised into two categories: (1) results owing to algorithmic intelligence, (2) results due to robotic actuation.

Algorithmic Intelligence
Two neural network models have been utilized to automate and suggest the task. The first one was used to determine the colour and the second one was used to predict the possible colour of the brick for a combination. The first one reduced the time of the process of identifying the colour, in contrast to computing the position of the colour based on Euclidean distance in a LAB colour space. The second neural network predicted the best possible colour combination based on a model, which also reduces the time to compute the combination compared to a search algorithm. In particular, this has resulted in a lower waiting time for the collaborator, keeping the creative flow almost intact.
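For context on the alternative that the authors compare against, the snippet below shows a plausible version of the Euclidean-distance baseline: convert a brick's mean colour to CIELAB with OpenCV and match it to the nearest of a few reference colours. The reference values are placeholders, not measurements from the paper, and the timing comparison above is the authors' observation, not something this sketch demonstrates.

```python
import cv2
import numpy as np

# Placeholder reference colours in BGR; real values would be sampled from bricks.
REFERENCES_BGR = {
    "red": (40, 40, 170),
    "grey": (128, 128, 128),
    "yellow": (80, 190, 210),
}

def to_lab(bgr):
    """Convert a single BGR colour to CIELAB using OpenCV."""
    pixel = np.uint8([[list(bgr)]])  # a 1x1 three-channel image
    return cv2.cvtColor(pixel, cv2.COLOR_BGR2LAB)[0, 0].astype(np.float32)

def nearest_colour(mean_bgr):
    """Return the reference colour with the smallest Euclidean distance in LAB."""
    sample = to_lab(mean_bgr)
    return min(REFERENCES_BGR,
               key=lambda name: np.linalg.norm(sample - to_lab(REFERENCES_BGR[name])))
```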

Robotic Actuation
The robot has been used as an agent to aid the human in the construction process. The accuracy of the robot has been used to realize the non-standard placement of the bricks in the masonry. The freedrive mode of the UR10 has also been utilized to digitize a brick in case of a manual override of the brick placement and to update the brick in the digital model. During the conduction of the experiment, the system used an IDE (Fig. 6) and Grasshopper (Fig. 7) as its main user interfaces.

Figure 5: Experimental setup after laying the initial bricks, with available bricks on the table
Figure 6: Background subtraction and prediction from one of the neural networks
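A minimal sketch of what pick-and-place and freedrive digitization could look like through the python-urx package mentioned earlier is given below. The IP address, poses, speeds and the digital output assumed to drive the gripper are placeholders, and the actual project also drives the robot through the Grasshopper Robots framework rather than only through this script.

```python
import time
import urx

ROBOT_IP = "192.168.0.10"   # assumed address of the UR10 controller
GRIPPER_OUTPUT = 0          # assumed digital output wired to the SMC gripper

def pick_and_place(robot, pick_pose, place_pose, a=0.3, v=0.15):
    """Move to a pick pose, close the gripper, and release at the place pose.

    Poses are (x, y, z, rx, ry, rz) in metres/radians, as python-urx expects."""
    robot.movel(pick_pose, acc=a, vel=v)
    robot.set_digital_out(GRIPPER_OUTPUT, True)    # close gripper
    time.sleep(0.5)
    robot.movel(place_pose, acc=a, vel=v)
    robot.set_digital_out(GRIPPER_OUTPUT, False)   # open gripper

def digitize_override(robot, seconds=10):
    """Let the mason hand-guide the arm in freedrive, then record the TCP pose
    so a manually placed brick can be updated in the digital model."""
    robot.set_freedrive(True)
    time.sleep(seconds)
    robot.set_freedrive(False)
    return robot.getl()  # current TCP pose (x, y, z, rx, ry, rz)

if __name__ == "__main__":
    rob = urx.Robot(ROBOT_IP)
    try:
        print(digitize_override(rob, seconds=5))
    finally:
        rob.close()
```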



The vision system and the prediction system used the IDE, and Grasshopper was used for the design and the robot control. This lacked a unifying user interface for the collaborator to concentrate on. The decision of the robot was not available to the mason until the point of robotic actuation. This possibly created a sense of disconnect with the robotic system and the physical design system.

Figure 7: Colour of the brick to be placed (in blue) is to be decided by its 6 nearest neighbouring bricks and the available bricks on the table

DISCUSSION
Following this study of human robot collaboration through a vision system, some important aspects of design collaboration and technical improvement have been revealed.

This experiment is a model of an on-site construction collaboration. It has been conducted in a controlled lab environment, which is usually different from a construction site. The identification of the bricks and the pickup zone is hard coded into the system using markers and an easily detectable white background. On a construction site, the zone could still be marked by a marker, but the brick detection could be done using semantic segmentation. This would help the model to be utilized in various construction scenarios.

The pattern prediction model has, for the time being, been trained on a supervised data set with 2430 possible combinations generated by another algorithm and curated by a designer. This will reflect a designer's bias on the pattern generation. A possible solution to overcome or augment that bias is to implement a recurrent neural network (RNN) to predict the pattern using the semantics of the masonry pattern. Collaboration between human and robotic arm with the aid of neural networks has considered two agencies working together in order to achieve a design goal. In this scenario the robotic arm, augmented with a computer vision system, contributes to the brick handling and pattern suggestion process. The machine learning model used in the process can be biased by the data available at a point in time, based on the availability of the bricks. The decisive and exhaustive nature of the creative tasks leaves the process with many variables. This presence of many variables contributes to a higher computation time that interrupts the design flow. In a collaborative process of this nature, it will be efficient for the human to handle the decisive and exhaustive creative nature of the process while the robot handles the more precise and analytical aspects of the task.

The present goal in this demonstrator was to create a random pattern using the available bricks. But it is possible to replace the evaluation criteria for the placement of bricks with the thermal or acoustic performance of the wall, where the creative agency of the human can be combined with the analytical and computational model of the machinic system.

The experiment has attempted to carry out a foundational task on which several on-site applications and creative collaborations can be based. It will be important to investigate the common ground of the designer delivering a creative solution, algorithms and autonomous systems generating a performative solution, and the role of robotics: augmenting the human co-creator with additional dexterity and degrees of freedom.

ACKNOWLEDGEMENTS
The authors would like to thank Boligfonden Kuben for their support of this research. They would also like to thank Poul Lund and Peter Skotte from the CREATE Prototyping team for their assistance in setting up the physical aspects of the experimentation.



REFERENCES
Dörfler, K, Rist, F and Rust, R 2013 'Interlacing', Rob|Arch 2012, Vienna, pp. 82-91
Dubor, A, Camprodom, G, Diaz, GB, Reinhardt, D, Saunders, R, Dunn, K, Niemelä, M, Horlyck, S, Alarcon-Licona, S, Wozniak-O'Connor, D and Watt, R 2016, 'Sensors and Workflow Evolutions: Developing a Framework for Instant Robotic Toolpath Revision', in Reinhardt, D, Saunders, R and Burry, J (eds) 2016, Robotic Fabrication in Architecture, Art and Design 2016, Springer International Publishing, Cham, pp. 410-425
Elashry, K and Glynn, R 2014, 'An Approach to Automated Construction Using Adaptive Programing', in McGee, W and Ponce de Leon, M (eds) 2014, Robotic Fabrication in Architecture, Art and Design 2014, Springer International Publishing, pp. 51-66
Garrido-Jurado, S, Muñoz-Salinas, R, Madrid-Cuevas, FJ and Marín-Jiménez, MJ 2014, 'Automatic generation and detection of highly reliable fiducial markers under occlusion', Pattern Recognition, 47(6), pp. 2280-2292
Ingold, T 2013, Making: Anthropology, Archaeology, Art and Architecture, Taylor & Francis
Johns, RL, Anderson, J and Kilian, A 2020 'Robo-Stim: Modes of Human Robot Collaboration for Design Exploration', Impact: Design With All Senses, Cham, pp. 671-684
Moorman, A, Liu, J and Sabin, JE 2016 'Context-Dependent Robotic Design Protocols and Tools', Proceedings of the 36th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA)
Piškorec, L, Jenny, D, Parascho, S, Mayer, H, Gramazio, F and Kohler, M 2019 'The Brick Labyrinth', Robotic Fabrication in Architecture, Art and Design 2018, Cham, pp. 489-500
[1] [Link]
[2] [Link]
[3] [Link]
[4] [Link]
[5] [Link]
[6] [Link]
[7] [Link]

