Researchers at The University of Osaka have developed a novel framework for measuring occupancy in open-plan offices with unprecedented precision. The system uses computer vision and AI to analyze occupancy at the micro scale, focusing on specific functional zones within the office.
This addresses a significant gap in current occupancy tracking methods, which typically only provide macro-level data and struggle to capture detailed usage patterns within shared spaces.
Traditional methods often fail to capture the complexities of how these spaces are actually used, leading to inefficiencies in resource allocation and design.
This new framework provides a practical and cost-effective solution for gathering granular occupancy data, which can inform evidence-based decisions about office design and management. This can lead to more sustainable, efficient, and user-friendly workspaces. The work is published in the Journal of Building Engineering.
The researchers used existing CCTV cameras and 3D pose estimation to create a computer vision system that accurately measures micro-scale occupancy within specific functional zones of open-plan offices. The system analyzes video footage to track individuals and classify their location within predefined zones, aggregating this data to reveal occupancy patterns.
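The core classification step described above — projecting each tracked person onto the floor plan and assigning them to a predefined zone — can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the zone polygons, coordinates, and function names are invented for the example, and the multi-view 3D pose estimation that produces the floor positions is not reproduced here.

```python
# Hypothetical sketch: assign a person's projected floor position to a
# named micro-zone using a standard ray-casting point-in-polygon test.
from typing import Dict, List, Tuple

Point = Tuple[float, float]


def point_in_polygon(p: Point, polygon: List[Point]) -> bool:
    """Ray-casting test: is floor point p inside the zone polygon?"""
    x, y = p
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge crosses the horizontal line through p
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def assign_zone(floor_point: Point, zones: Dict[str, List[Point]]) -> str:
    """Map a projected floor position to the first matching micro-zone."""
    for name, polygon in zones.items():
        if point_in_polygon(floor_point, polygon):
            return name
    return "outside"


# Three illustrative micro-zones on a 10 m x 6 m floor plan (made-up layout)
zones = {
    "desks":   [(0, 0), (5, 0), (5, 6), (0, 6)],
    "meeting": [(5, 0), (10, 0), (10, 3), (5, 3)],
    "lounge":  [(5, 3), (10, 3), (10, 6), (5, 6)],
}

print(assign_zone((2.0, 3.0), zones))  # desks
print(assign_zone((8.0, 1.0), zones))  # meeting
```

In a real deployment the floor positions would come from calibrated multi-camera pose estimation rather than hand-entered coordinates, but the zone-assignment logic stays the same.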
Real-world testing validated the system’s accuracy, providing valuable insights into how employees use different office areas. The findings can inform decisions regarding office layout, resource allocation (lighting, heating, cleaning), and energy management, ultimately contributing to more efficient and sustainable workspaces.
System design framework and field validation. (a) The overview of the design framework for occupancy measurement systems. (b) Arrangement of the experimental space and camera installation: The floor plan is divided into three micro-zones using ground markers. Data collection from four cameras is conducted via remote control. Credit: Journal of Building Engineering (2025). DOI: 10.1016/j.jobe.2025.113037
Core components and outputs of the occupancy measurement system. (a) The overview of 3DPE and 2D projection extraction. (b) Occupancy status (19:58:40–19:59:39). (c) Occupancy status (19:59:40–20:00:39). (d) Occupied and vacant duration (19:58:40–20:00:39) (e) Variations in the ratio of occupied zones by count and area (19:58:40–20:00:39). (f) Variations of occupancy frequency (19:58:40–20:00:39, Measurement). (g) Variations of occupancy frequency (19:58:40–20:00:39, GT). Credit: Journal of Building Engineering (2025). DOI: 10.1016/j.jobe.2025.113037
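The per-zone outputs named in the caption — occupied and vacant duration, and occupancy frequency over a time window — are straightforward aggregations once each time step has a set of occupied zones. The sketch below assumes a simple input format (one set of occupied zone names per sampling interval); the paper's actual data structures and sampling rate are not specified here.

```python
# Hypothetical sketch: aggregate per-step zone occupancy into occupied
# duration, vacant duration, and occupancy frequency for each micro-zone.
from collections import Counter
from typing import Dict, Iterable, List, Set


def occupancy_metrics(
    samples: List[Set[str]], zone_names: Iterable[str], interval_s: int = 1
) -> Dict[str, Dict[str, float]]:
    """samples: one set of occupied zone names per time step."""
    occupied = Counter()
    for step in samples:
        for zone in step:
            occupied[zone] += interval_s
    total = len(samples) * interval_s
    return {
        z: {
            "occupied_s": occupied[z],
            "vacant_s": total - occupied[z],
            "frequency": occupied[z] / total if total else 0.0,
        }
        for z in zone_names
    }


# Six one-second samples: which micro-zones held at least one person
samples = [
    {"desks"}, {"desks", "meeting"}, {"meeting"},
    {"meeting"}, set(), {"desks"},
]
m = occupancy_metrics(samples, ["desks", "meeting", "lounge"])
print(m["desks"])   # {'occupied_s': 3, 'vacant_s': 3, 'frequency': 0.5}
print(m["lounge"])  # {'occupied_s': 0, 'vacant_s': 6, 'frequency': 0.0}
```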
Sihua Chen, a doctoral candidate involved in the research, highlighted the interdisciplinary nature of the project, bridging environmental engineering and computer science to solve real-world challenges. She emphasized the potential of this technology to fill a gap in existing occupancy measurement techniques and provide data-driven support for sustainable design and operation of indoor open-plan office spaces.
This research has significant implications for the future of workplace design. By providing accurate and detailed occupancy data, the framework enables data-driven optimization of office layouts, resource allocation, and energy control, leading to more sustainable and productive work environments.
More information:
Sihua Chen et al, Development of an occupancy measurement system for micro-zones within open office spaces based on multi-view multi-person 3D pose estimation, Journal of Building Engineering (2025). DOI: 10.1016/j.jobe.2025.113037
Citation:
AI-powered occupancy tracking system optimizes open-plan office design (2025, July 14)
retrieved 15 July 2025
from https://techxplore.com/news/2025-07-ai-powered-occupancy-tracking-optimizes.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.