
Occupancy Visualization towards Activity Recognition

Alexander Tessier1, Simon Breslav1, Kean Walmsley1, Michael Lee1, Hali Larsen1, Jacky Bibliowicz1, Pan Zhang1,
Liviu-Mihai Calin1, Bokyung Lee2, Josh Cameron1, Rhys Goldstein1, Azam Khan1

1 Autodesk Research, Toronto, ON, Canada. {firstname.lastname}@autodesk.com
2 Department of Industrial Design, KAIST, Daejeon, Republic of Korea. bokyunglee@kaist.ac.kr

ABSTRACT
We present a sensor visualization system that integrates data streams from individual custom sensor arrays together with Building Automation System (BAS) data. To help bridge the gap between actual building usage by the occupants and the aggregate usage assumed by the control system, we have developed several sensor processing subsystems moving toward automated human activity recognition without the need to directly instrument the occupants. By having a system with a detailed understanding of occupant behavior and needs, we believe buildings could be made much more efficient, thereby reducing energy consumption and working toward sustainability of the built environment.

CCS CONCEPTS
• Computing methodologies~Activity recognition and understanding; • Human-centered computing~Visualization toolkits; • Human-centered computing~Visual analytics; • Information systems~Temporal data

KEYWORDS
Visual Analytics; Time-series Data; Human Activity Recognition; Computer Vision

ACM Reference format:
Alexander Tessier, Simon Breslav, Kean Walmsley, Michael Lee, Hali Larsen, Jacky Bibliowicz, Pan Zhang, Liviu-Mihai Calin, Bokyung Lee, Josh Cameron, Rhys Goldstein and Azam Khan. 2019. Occupancy Visualization towards Activity Recognition. In The 1st ACM International Workshop on Device-Free Human Sensing (DFHS'19). ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3360773.3360877

1 Introduction
As environmental sustainability becomes a global priority, key areas of concern have been collated in the United Nations Sustainable Development Goals (SDGs). In terms of the built environment, Goal 11.c mentions the need for building sustainable and resilient buildings, especially for the least developed countries, so that these nations do not simply follow the actions of the developed world [1]. In the U.S., buildings are the dominant cause of greenhouse gases (GHG) at 2.2 billion metric tons, accounting for 40% of American CO2 emissions [2]. The end use of energy consumption in buildings is primarily space heating (37%), water heating (12%), space cooling (10%), and lighting (9%). As the major factors are all related to the actual usage of the buildings, the gap between the operation of a building and the actual needs of the occupants is a potentially large opportunity for energy and water reduction.
Previous work has shown that occupant behavior can impact building energy consumption by up to 23.6 percent [4], based on simulation sensitivity analysis. The frequency and nature of these behaviors have yet to be established and sufficiently modeled. Many techniques exist to detect various activities, such as position [8, 5] and health from gait [6], but none adequately cover the contexts needed for efficient control. To study the potential of sensor fusion in developing novel device-free systems attached to infrastructure (buildings, bridges, etc.), we have created a visualization system that can interactively help researchers discover correlations between systems before investing heavily in programming or machine learning training. By using multiple sensor types with computer vision, and combining this with the rich metadata environment of Building Information Models (BIM), sensors and sensing systems can be evaluated to visualize potential synergies and interactions.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
DFHS'19, November 10, 2019, New York, NY, USA
© 2019 Association for Computing Machinery.
ACM ISBN 978-1-4503-7007-3/19/11…$15.00
https://doi.org/10.1145/3360773.3360877

2 Sensor Visualizations
We have developed Dasher360, a web-based sensor visualization tool that displays current sensor values in the context of building geometry and can animate historical data for temporal analysis. The visualization techniques described in this paper are implemented in JavaScript as an extension to the Autodesk Forge Viewer [3], a web-based framework for displaying BIM data.
Figure 1 shows a visualization of CO2 sensors, rendered as green sensor dots in 3D space on the BIM. Clicking on a sensor dot opens a 2D plot of CO2 readings. By having the sensors positioned within the context of the 3D geometry, spatial occupancy patterns in the building can be shown evolving over time as CO2 levels change, albeit in an abstract, indirect way. The sensor dot visualization is implemented using a particle system which can render thousands of interactive dots in a layer above the 3D building geometry.
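As a rough illustration of how such a dot layer could be organized, the sketch below is a hypothetical, plain-JavaScript stand-in for the actual Dasher360/Forge Viewer code: the class, method names, and color thresholds are all invented for illustration. Each sensor becomes a colored dot, and clicking a dot yields its time series for the 2D plot.

```javascript
// Map a CO2 reading (ppm) to a traffic-light dot color.
// Thresholds are illustrative, not the system's actual values.
function co2Color(ppm) {
  if (ppm < 800) return "green";   // well-ventilated
  if (ppm < 1200) return "yellow"; // borderline
  return "red";                    // likely occupied / poorly ventilated
}

class SensorDotLayer {
  constructor() {
    this.dots = new Map(); // sensorId -> { position, readings, color }
  }
  addSensor(id, position) {
    this.dots.set(id, { position, readings: [], color: "green" });
  }
  // Append a timestamped reading; a renderer would recolor the dot.
  pushReading(id, timestamp, ppm) {
    const dot = this.dots.get(id);
    dot.readings.push({ timestamp, ppm });
    dot.color = co2Color(ppm);
  }
  // Simulates clicking a dot: return its series for a 2D plot.
  onClick(id) {
    return this.dots.get(id).readings;
  }
}

// Usage: a meeting-room sensor whose readings climb as the room fills.
const layer = new SensorDotLayer();
layer.addSensor("room-101", { x: 4.2, y: 1.5, z: 2.7 });
layer.pushReading("room-101", "2019-06-01T09:00Z", 620);
layer.pushReading("room-101", "2019-06-01T10:00Z", 1350);
console.log(layer.dots.get("room-101").color); // "red"
console.log(layer.onClick("room-101").length); // 2
```

In a real deployment the dot geometry would live in an overlay scene above the BIM, with one particle per sensor updated from the live data stream.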

Figure 1: Time-series CO2 data from BAS sensors.

To further validate CO2 readings, we visualize them as a spatial heatmap (Figure 2). The heatmap shading uses Shepard's method [2] to perform multivariate interpolation through a GLSL shader. The timeline at the bottom of the window can be used to select a time period and play back changes in the heatmap over time. On top of the heatmap, icons of human figures indicate occupant positions based on prototype infrared (IR) sensors developed by Schneider Electric. In the office dataset, we can observe the correlation between higher CO2 levels and the presence of occupants in the closed meeting room in the bottom left of Figure 2.

Figure 2: Temperature and CO2 heatmaps combined with local infrared sensors showing the specific placement of individuals.

In a manufacturing workshop, in addition to CO2, we instrumented a pedestrian walkway with an array of strain gauges, accelerometers, sound, temperature, humidity, passive infrared motion (PIR), pressure, and ambient light sensors. At both ends of the bridge, we placed video cameras. We processed the video using computer vision to more easily discover when to look for events (occupancy events are shown as blue ticks in the timeline in Figure 3) and to serve as ground truth.

Figure 3: Computer Vision Time-series Tagging.

3 Video Annotations
Our PIR motion detector could only generate data when doors were opened, making it necessary to annotate the count of people on the bridge using the video data. To reduce the labeling workload, we used a simple Histogram of Oriented Gradients (HOG) [11] human detector and applied it to each frame of the video to create a crude occupancy sensor. In this way, we could provide a base layer of automation for recording when occupants were actually on the bridge in our dataset. This provided a count of people and a bounding box in the video frame (see Figure 4). We then processed this output to generate a time series of human occupancy.

Figure 4: Computer Vision Video Annotation (green outlines on video).

4 Pose Estimation
Human activities and behaviors can be simplified and aggregated into average, schedule-based behaviors, as in [4]. However, how these behaviors are affected by specific design features, and how those features relate back to performance, may not correlate well, since the details can be lost in the aggregation process [9]. To study these interactions and generate finer-grained behavioral observations, we employed pose estimation [7]. Context-specific behaviors, such as pausing in the middle of the bridge to observe the surroundings, carrying objects, or walking in groups, can only be annotated with poses, as in Figure 5.
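To illustrate the kind of context-specific behavior that pose tracks make detectable, here is a minimal sketch, hypothetical and not the paper's implementation, that tags "pausing" frames by thresholding the displacement of a pose's centroid over a sliding window of frames. The window size and movement threshold are invented for illustration.

```javascript
// Euclidean distance between two 2D centroids.
function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Label each frame index where the centroid moved less than
// `minMove` (in meters, say) over the previous `win` frames.
function tagPauses(centroids, win = 5, minMove = 0.2) {
  const paused = [];
  for (let i = win; i < centroids.length; i++) {
    if (dist(centroids[i], centroids[i - win]) < minMove) {
      paused.push(i);
    }
  }
  return paused;
}

// Usage: a person walks, stops mid-bridge, then continues.
const track = [
  { x: 0.0, y: 0 }, { x: 0.5, y: 0 }, { x: 1.0, y: 0 }, { x: 1.5, y: 0 },
  { x: 2.0, y: 0 }, { x: 2.0, y: 0 }, { x: 2.0, y: 0 }, { x: 2.0, y: 0 },
  { x: 2.0, y: 0 }, { x: 2.05, y: 0 }, { x: 2.6, y: 0 }, { x: 3.1, y: 0 },
];
console.log(tagPauses(track)); // → [ 9 ]
```

A real annotation layer would combine such per-track labels with the sensor time series so that, for example, strain-gauge spikes can be cross-checked against tagged pauses.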

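The camera-to-floor mapping used in this section relies on a planar homography. The sketch below shows the projective step only: applying a 3×3 matrix H to an image point. The matrix values here are a toy stand-in; in practice H would come from the marker-based calibration, which is not shown.

```javascript
// Apply a 3x3 homography H (row-major array of 9 numbers) to an
// image point (u, v) in pixels, returning floor-plane coordinates.
function applyHomography(H, u, v) {
  const x = H[0] * u + H[1] * v + H[2];
  const y = H[3] * u + H[4] * v + H[5];
  const w = H[6] * u + H[7] * v + H[8]; // projective scale
  return { x: x / w, y: y / w };
}

// Usage with a toy "calibration": a pure scale of 0.125 m per pixel.
const H = [0.125, 0, 0, 0, 0.125, 0, 0, 0, 1];
const foot = applyHomography(H, 640, 360); // foot keypoint in pixels
console.log(foot); // { x: 80, y: 45 }
```

Mapping the lowest (foot) keypoint of each detected pose this way yields the load positions on the bridge deck discussed below in the text.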

Figure 5: Computer Vision Pose Estimation (OpenPose) [7].

By calibrating cameras with markers and using homography, we can map the locations of the 2D poses into 3D and recover the relative position of the plane upon which the poses lie within the camera frustum. This positioning provides better spatial context, as seen in Figure 6, and can help determine the load positions of people on the bridge structure. Along with the homography, tracking can be added by classifying heads and following footfalls to create continuous and stable positions for individual poses across frames [10].

Figure 6: Pose Estimation Streaming into Visualization Tool.

As actions can be complex and group-oriented, standard annotation tools and techniques can be extended to create a context-aware and relevant labeling system for use in specific situations, such as collaborative interactions in an office environment [9] (see Figure 7). These annotations can be leveraged in supervised machine learning applications, along with sensor data and on top of pose recognition, for context-aware activity recognition. Note that anonymization is supported to some degree by overlaying the extracted stick-figure pose "skeletons" on a 3D model or on a frame of the source video in which no people are present.

Figure 7: Skeleton-based video annotation system [9].

5 Discussion
By collecting data from a variety of environmental sensors, together with detailed pose information from video sensors, the correlated data sets could provide several benefits. The data can be used for precise indoor positioning together with activity recognition. This, in turn, can be used for (a) precision HVAC and lighting control that increases building performance, and (b) human safety and security in both normal and emergency scenarios. The former use case will be critical in low-energy or net-zero buildings [12], where maximal efficiency is needed continuously, both to meet immediate needs and to conserve resources for use over seasonal periods.
Uniquely, this work positions the sensed data in the context of a Building Information Model. This extra information adds enough context to make the data meaningful to the observer. This work also focuses on making the data directly accessible through visualization, supporting a data-driven exploratory analysis process, rather than providing information that has been aggregated in space and/or time. This is a critical point, as this process can help form new hypotheses rather than only confirming the existence of features.

6 Conclusion and Future Work
The work presented here has spanned multiple projects over several years, working towards a more precise, complex, yet automated understanding of occupancy in buildings. The key contribution of this work is a comprehensive visual analytics system that can interactively help researchers validate and understand various raw sensing data in the context of BIM, prior to any aggregation. By connecting multiple visual analytics methods, we believe this work will form a solid foundation to support the development and validation of equally precise simulation models that can help create highly efficient model-based controls, as well as design tools for architects and engineers to develop buildings that do not generate any greenhouse gases.

REFERENCES
[1] UN General Assembly. (2015). Transforming our world: the 2030 Agenda for Sustainable Development, 21 October 2015, A/RES/70/1.
[2] U.S. Department of Energy. (2012). 2011 Buildings Energy Data Book.
[3] Autodesk Forge Viewer, https://forge.autodesk.com/
[4] Azar, E., & Menassa, C. C. (2012). A comprehensive analysis of the impact of occupancy parameters in energy simulation of office buildings.


[5] Lv, J., Yang, W., & Man, D. (2017). Device-free passive identity identification via WiFi signals. Sensors, 17(11), 2520.
[6] Juen, J., Cheng, Q., Prieto-Centurion, V., Krishnan, J. A., & Schatz, B. (2014). Health monitors for chronic disease by gait analysis with mobile phones. Telemedicine and e-Health, 20(11), 1035-1041.
[7] Cao, Z., Hidalgo, G., Simon, T., Wei, S. E., & Sheikh, Y. (2018). OpenPose: realtime multi-person 2D pose estimation using Part Affinity Fields. arXiv preprint arXiv:1812.08008.
[8] Savazzi, S., Rampa, V., Vicentini, F., & Giussani, M. (2015). Device-free human sensing and localization in collaborative human-robot workspaces: A case study. IEEE Sensors Journal, 16(5), 1253-1264.
[9] Lee, B., Lee, M., Zhang, P., Tessier, A., & Khan, A. (2019). An Empirical Study of how Socio-Spatial Formations are influenced by Interior Elements and Displays in an Office Context. To appear in ECSCW 2019: Proceedings of the 18th European Conference on Computer Supported Cooperative Work, London, UK. Springer, Cham.
[10] Villaggi, L., Stoddart, J., Zhang, P., Tessier, A., & Benjamin, D. (2019). Design Loop: Calibration of a Simulation of Productive Congestion through Real-World Data for Generative Design Frameworks. To appear in Computational Design Modeling: Proceedings of the Design Modeling Symposium Berlin 2019. Springer Science & Business Media.
[11] Dalal, N., & Triggs, B. (2005). Histograms of oriented gradients for human detection. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05).
[12] Aelenei, L., Aelenei, D., Goncalves, H., Lollini, R., Musall, E., Scognamiglio, A., Cubi, E., & Noguchi, M. (2013). Design Issues for Net Zero-Energy Buildings. Open House International, 38(3), 7-14.
