Existing heating, ventilation, and air-conditioning systems struggle to account for occupants' dynamic thermal needs, resulting in overheating or overcooling and substantial energy waste. This situation underscores the importance of occupant-oriented microclimate control, in which dynamic individual thermal comfort assessment is the key. Therefore, this paper proposes a vision-based approach to estimate the individual clothing insulation rate ($I_{\mathrm{cl}}$) and metabolic rate ($M$), the two critical factors for assessing personal thermal comfort. Specifically, with a thermal camera as the input source, a convolutional neural network (CNN) is implemented to recognize an occupant's clothes type and activity type simultaneously. The recognized clothes type then helps to differentiate the skin region from the clothing-covered region, enabling calculation of the skin temperature and the clothes temperature. With the two recognized types and the two computed temperatures, $I_{\mathrm{cl}}$ and $M$ can be estimated effectively. In the experimental phase, a novel thermal dataset is introduced, which allows evaluation of the CNN-based recognizer module, the skin and clothes temperature acquisition module, and the $I_{\mathrm{cl}}$ and $M$ estimation module, demonstrating the effectiveness and automation of the proposed approach.
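The final estimation stage described above can be sketched as a simple lookup step. This is a minimal illustration, not the paper's method: the category labels and the mapping from recognized types to $I_{\mathrm{cl}}$ (in clo) and $M$ (in met) are assumptions, with representative values in the spirit of ASHRAE 55-style tables; the actual approach additionally uses the measured skin and clothes temperatures to refine the estimates.

```python
# Hypothetical sketch: map the CNN's recognized clothes type and activity
# type to clothing insulation I_cl (clo) and metabolic rate M (met).
# Labels and values are illustrative assumptions, not from the paper.

# Representative clo values per assumed clothes-type label.
CLO_TABLE = {
    "short_sleeve": 0.57,   # trousers + short-sleeve shirt
    "long_sleeve": 0.61,    # trousers + long-sleeve shirt
    "sweater": 1.01,        # trousers + shirt + sweater
}

# Representative met values per assumed activity-type label.
MET_TABLE = {
    "seated": 1.0,
    "standing": 1.2,
    "walking": 1.7,
}

def estimate_icl_and_m(clothes_type: str, activity_type: str) -> tuple:
    """Return (I_cl in clo, M in met) for the recognized types."""
    return CLO_TABLE[clothes_type], MET_TABLE[activity_type]

icl, m = estimate_icl_and_m("sweater", "seated")
print(icl, m)  # 1.01 1.0
```

In practice the two tables would be indexed by the CNN's class outputs, and the skin/clothes temperatures computed from the thermal image would feed a more refined comfort model.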