Universidade de Aveiro
Departamento de Eletrónica, Telecomunicações e Informática
2019

Ana Laura Costa Almeida

Plataforma de monitorização em IoT de recolha de dados sobre o conforto térmico interior

IoT-based monitoring platform for indoor thermal comfort data collection

Dissertation submitted to the Universidade de Aveiro in fulfilment of the requirements for the degree of Master in Computer and Telematics Engineering, carried out under the scientific supervision of Prof. Doutor Diogo Nuno Pereira Gomes, Assistant Professor at the Departamento de Eletrónica, Telecomunicações e Informática of the Universidade de Aveiro, and of Prof. Doutor Mário Luís Pinto Antunes, Invited Adjunct Professor at the Universidade de Aveiro.


“Welcome to real life. It sucks! You’re gonna love it.”


o júri / the jury

presidente / president: Prof. Doutor Arnaldo Silva Rodrigues de Oliveira, Assistant Professor at the Universidade de Aveiro

vogais / examiners committee: Prof. Doutor Diogo Nuno Pereira Gomes, Assistant Professor at the Universidade de Aveiro (supervisor)

Prof. Doutora Ana Cristina Costa Aguiar


agradecimentos / acknowledgements

First of all, I would like to thank my supervisor, Prof. Doutor Diogo Gomes, for the support and guidance provided throughout this challenge.

I would also like to thank the Instituto de Telecomunicações of the Universidade de Aveiro for the opportunity and for the assistance provided.

I also want to thank my parents for giving me the opportunity to attend higher education. Without them, none of this would have been possible.

Finally, I would like to thank my family and friends, who supported and motivated me throughout my academic journey, especially those who faced the hardest moments with me without ever ceasing to believe in me.


Palavras Chave / Keywords data collection, thermal comfort, Internet of Things, device monitoring, sensor data gathering, alert system.

Resumo / Abstract It is estimated that, nowadays, a large share of the energy produced is consumed by residential buildings for climate control purposes. With the goal of developing solutions that reduce the energy consumption associated with climate control without compromising the residents' comfort, several studies have been conducted worldwide on occupants' behavior and their comfort level. It is within this scope that this dissertation is positioned.

This dissertation proposes to collect a dataset on thermal comfort levels in a domestic environment, based on the Mediterranean climate and on the typical routine of Portuguese families. Later, this dataset can be studied in order to extract useful information about what kind of solutions can be developed to reduce energy consumption while maintaining the comfort levels already in place.

To collect this dataset, certain parameters of the houses' indoor environment, such as temperature, humidity and motion detection, were gathered over a long period of time through ZigBee sensors.

These measurements were sent to and stored in an IoT platform, built on open-source software, which monitors the data collection process and generates alerts when it detects a failure that cannot be corrected without manual intervention.

The readings from the ZigBee sensors are sent to the IoT platform through a gateway, which acts as an intermediary in the communication and implements functionalities required for the monitoring process. The study enabled data collection in 15 houses for approximately 8 months, and it is still ongoing. An initial analysis of the collected data was also carried out, from the point of view of the quality of the obtained dataset and of the data collection process developed.


Keywords dataset, thermal comfort, IoT, device monitoring, sensor data gathering, alert system.

Abstract Considering the noticeable climatic changes of the past years, a general interest in finding solutions to reduce the environmental footprint has been growing.

Among the major sources of energy consumption in domestic buildings, climate control ranks at the top. In order to achieve a balance between energy consumption levels and the occupants' thermal comfort, several studies have been conducted all over the world regarding users' behavior and their thermal comfort perception. It is within this scope that this dissertation addresses comfort monitoring.

The study proposed in this project focuses on retrieving thermal comfort data based on the Portuguese families' lifestyle and the Mediterranean climate. This dissertation aims to collect a good quality dataset that can later be analyzed in order to extract useful information about how to develop effective solutions able to reduce energy consumption without compromising the thermal comfort levels that householders are used to.

In order to collect the dataset, certain long-term indoor measurements, such as temperature, humidity and motion detection, were collected through ZigBee sensors. These measurements were sent to and stored in an open-source IoT platform, responsible for monitoring the data collection process and generating alerts when it detects a failure that cannot be solved without user intervention.

The measurements are sent from the sensors to the IoT platform through a gateway, which acts as an intermediary for protocol translation and also implements monitoring functionalities.

As a result of the study, data from 15 houses were collected for approximately 8 months, and the study is still running. The resulting dataset has undergone a preliminary analysis, from the point of view of both the quality of the data obtained and the data collection process developed.


Contents

List of Figures
List of Tables
Glossary

1 Introduction
  1.1 Motivation
  1.2 Objectives
  1.3 Structure

2 State of the art
  2.1 Data collection methods
  2.2 IoT Platforms
    2.2.1 Generic Architecture
    2.2.2 Requirements
      2.2.2.1 Scalability
      2.2.2.2 Heterogeneity and Interoperability
      2.2.2.3 Flexibility
      2.2.2.4 Availability and Performance
      2.2.2.5 Security and Privacy
      2.2.2.6 Robustness and resilience
      2.2.2.7 Summary
    2.2.3 Application Domains
      2.2.3.1 Application Development Platforms
      2.2.3.2 Device Management Platforms
      2.2.3.3 Data Visualization Platforms
    2.2.4 IoT Platforms Comparison
      2.2.4.1 Open Source VS Proprietary software/hardware
  2.3 User participation in thermal comfort case studies
  2.4 Related work
    2.4.1 Comparison
  2.5 Summary

3 Architecture
  3.1 Data collection scope
  3.2 Technological decisions
    3.2.1 IoT Platforms
    3.2.2 Sensors
      3.2.2.1 Sensor types regarding mobility
      3.2.2.2 Sensor communication technology: wired vs wireless
  3.3 Requirements
    3.3.1 Scalability
    3.3.2 Heterogeneity and Interoperability
    3.3.3 Flexibility
    3.3.4 Security and Privacy
    3.3.5 Availability
    3.3.6 Robustness, resilience, and failure handling
    3.3.7 Data storage
    3.3.8 Cost-effectiveness
  3.4 Functionalities
    3.4.1 Device management
    3.4.2 Monitoring system and alerts
    3.4.3 User Interaction
    3.4.4 Automated Provisioning
  3.5 Overview
    3.5.1 Sensor nodes
      3.5.1.1 Gateway
    3.5.2 Device Manager
    3.5.3 Data Receiver
    3.5.4 Monitoring System
    3.5.5 Communication Channel
    3.5.6 Feedback Handler
    3.5.7 Data visualization
  3.6 Summary

4 Implementation
  4.1 Decisions taken and adopted technologies
    4.1.1 Data to collect
      4.1.1.1 Thermal comfort perception, indoor temperature, and relative humidity
      4.1.1.2 Occupants' presence
      4.1.1.3 Weather forecast
    4.1.2 IoT Platforms
    4.1.3 Sensing devices and gateway
      4.1.3.1 Communication protocols
      4.1.3.2 Protocol translation
      4.1.3.3 Sensing Devices
      4.1.3.4 Gateway device
      4.1.3.5 Gateway OS
    4.1.4 Device management and monitoring
      4.1.4.1 System Updates
      4.1.4.2 Alert System
      4.1.4.3 Log files upload
    4.1.5 Thermal comfort perception feedback
    4.1.6 User communication channel
      4.1.6.1 Facebook Messenger vs Hubot
      4.1.6.2 Webpage
    4.1.7 Integrity and robustness
      4.1.7.1 Read-only filesystem
  4.2 Functional Description
    4.2.1 Yocto
    4.2.2 Registration Process
    4.2.3 Zigbee2mqtt
      4.2.3.1 File structure and configuration
      4.2.3.2 Authentication
      4.2.3.3 MQTT topics
      4.2.3.4 Relevant changes in code
      4.2.3.5 Flow and debugging
      4.2.3.6 Integration with user UI: pairing a device
    4.2.4 Gateway web server
    4.2.5 Weather sensor
    4.2.6 Facebook Messenger chatbot
      4.2.6.3 Webhooks event processing
      4.2.6.4 Chatbot subscription and feedback
    4.2.7 Feedback handler
    4.2.8 Monitoring system
      4.2.8.1 Watchdog
      4.2.8.2 Alerta
      4.2.8.3 Alerts list
      4.2.8.4 Creating an Alert
      4.2.8.5 Heartbeats
      4.2.8.6 Plugins and custom behaviors
    4.2.9 Software Updates
      4.2.9.1 Update packages
      4.2.9.2 Update flow
  4.3 Establishing contact with the participants of the study
    4.3.1 First user approach: study presentation
    4.3.2 Second user approach: installation details
      4.3.2.1 Installation details
      4.3.2.2 Installation Instructions
      4.3.2.3 Pairing Instructions
      4.3.2.4 Facebook Messenger Chatbot subscription

5 Evaluation and Results
  5.1 Solution validation
    5.1.1 Deployment Scenario
    5.1.2 ZigBee demonstration
      5.1.2.1 Initialization
      5.1.2.2 Pairing process
      5.1.2.3 Measurements publication
    5.1.3 Webpage demonstration
      5.1.3.1 Pairing mode
      5.1.3.2 Chatbot subscription demonstration
    5.1.4 Thermal comfort feedback demonstration
    5.1.5 Update system demonstration
    5.1.6 Logs Uploading demonstration
    5.1.7 Alert system demonstration
      5.1.7.1 Dashboard
      5.1.7.2 Heartbeats
  5.2 Study Results
    5.2.1 Participants profile
    5.2.2 Data collected
    5.2.3 Alerta events
  5.3 Discussion of results
    5.3.1 Worst performance gateways
    5.3.2 Best performance gateways
    5.3.3 Thermal comfort collected data
    5.3.4 Door sensor collected data
    5.3.5 Top generated alerts
      5.3.5.1 Door sensor is not publishing
      5.3.5.2 Weather sensor is not publishing
      5.3.5.3 Lost Connection to MQTT server
      5.3.5.4 zgb2mqtt is not running
    5.3.6 Weather Location
    5.3.7 Participants' opinions regarding the study

6 Conclusion
  6.1 Future work

References
Appendix A: Instructions Manual provided to the participants
Appendix B: Details of the study provided to the participants
Appendix C: Consent form provided to the adult participants
Appendix D: Consent form provided to the underage participants


List of Figures

2.1 SCoT architecture diagram based on Hono middleware
3.1 General overview of the proposed architecture
4.1 CC2531 USB sniffer
4.2 Xiaomi Aqara Motion Sensor
4.3 Xiaomi Aqara Temperature and Humidity Sensor
4.4 Xiaomi Aqara Door and Window Sensor
4.5 NanoPi-NEO2 layout
4.6 Alerta Dashboard example
4.7 General overview of the solution implemented
4.8 Registration process components from the architecture diagram
4.9 Registration process sequence diagram
4.10 Data collection process architectural diagram
4.11 Architectural components from the pairing process
4.12 Architectural weather components
4.13 Architectural components involved in the feedback retrieval process
4.14 Architectural components belonging to the alert system
4.15 Flowchart associated with the Alerta plugin developed
4.16 Architectural components belonging to the software update system
5.1 Deployment diagram of the IoT Platform
5.2 Xiaomi door sensor installation (door closed/door open)
5.3 Xiaomi temperature and motion sensors installation
5.4 Gateway installation
5.5 Gateway webpage
5.6 Webpage pairing mode
5.7 VM webpage for Facebook Messenger Chatbot subscription
5.10 Chatbot quick-reply message for thermal comfort feedback retrieval
5.11 Participants' responses to the thermal comfort survey
5.12 SWUpdate log for update demonstration
5.13 Demonstration update result
5.14 FTP directories for each tenant
5.15 Logfile names
5.16 Alerta dashboard
5.17 Alerta Heartbeats tab
5.18 Chatbot message asking for rebooting the gateway
5.19 Chatbot message asking for waking up the door sensor
5.20 Total percentage of data collected per gateway-ID
5.21 Total data collected by each sensor from each gateway
5.22 Total percentage of data collected by each sensor type
5.23 Total percentage of each alert generated
5.24 Total percentage of sensor related alerts of each gateway-ID

List of Tables

2.1 Advantages and disadvantages of several data collection methods
2.2 IoT Platforms comparison - Part 1
2.3 IoT Platforms comparison - Part 2
2.4 Related work comparison - Part 1
2.5 Related work comparison - Part 2
4.1 Communication Protocols comparison
4.2 Feedback Handler associated logic


Glossary

AC  Air Conditioning
AES  Advanced Encryption Standard
API  Application Programming Interface
ASHRAE  American Society of Heating, Refrigerating and Air-Conditioning Engineers
BLE  Bluetooth Low Energy
BMS  Building Management Systems
CASAS  Center for Advanced Studies in Adaptive Systems
CLI  Command-line Interface
CoAP  Constrained Application Protocol
DNS  Domain Name System
DNS-SD  DNS Service Discovery
DoS  Denial-of-service
DTLS  Datagram Transport Layer Security
FTP  File Transfer Protocol
GSN  Global Sensor Networks
GUI  Graphical User Interface
GW  Gateway
HTTP  Hypertext Transfer Protocol
HTTPS  Hypertext Transfer Protocol Secure
IBM  International Business Machines
IoT  Internet of Things
IPMA  Instituto Português do Mar e da Atmosfera
JSON  JavaScript Object Notation
LAN  Local Area Network
LoRa  Long Range
LSM  Linked Sensor Middleware
M2M  Machine-to-machine
mDNS  Multicast DNS
MQTT  Message Queuing Telemetry Transport
NoSQL  Not Only SQL
NTP  Network Time Protocol
OS  Operating System
OTA  Over-the-air
PAN  Personal Area Network
PoE  Power over Ethernet
PMV  Predicted Mean Vote
PSID  Page-Scoped ID
REST  Representational State Transfer
RDF  Resource Description Framework
RPC  Remote Procedure Call
RPM  Red Hat Package Manager
SCoT  Smart Cloud of Things
SMTP  Simple Mail Transfer Protocol
SN  Sensor Node
SQL  Structured Query Language
TCP  Transmission Control Protocol
TLS  Transport Layer Security
UDP  User Datagram Protocol
UI  User Interface
URL  Uniform Resource Locator
VM  Virtual Machine
VoIP  Voice over IP
WLAN  Wireless Local Area Network
WSGI  Web Server Gateway Interface
WSU  Washington State University
XML  Extensible Markup Language
XMPP  Extensible Messaging and Presence Protocol


CHAPTER 1

Introduction

1.1 Motivation

Energy production and consumption are responsible, directly or indirectly, for some of the greatest damage to the environment, due to the emissions of polluting gases that contribute to the intensification of the greenhouse effect [1].

A noticeable increase in environmental awareness has given rise to several strategies to reduce humanity's environmental footprint.

Buildings account for approximately 40% of global energy consumption, and this share is predicted to continue growing worldwide in the coming decades. Three-quarters of total energy consumption in the buildings sector is residential, where there is great potential to improve energy efficiency [2]. Besides appliances and lighting, climate control (heating and cooling) is one of the largest contributors to residential buildings' energy consumption.

In Portugal, the domestic sector is responsible for about 26% of the total energy consumption [3], of which 22% is directed to climate control [4].

Climate control is directly related to indoor thermal comfort, a very important factor that house occupants are not willing to give up. Thermal comfort is a condition of the mind that expresses satisfaction with the thermal environment [5]. In this regard, several studies have been conducted in the last decades in order to reduce buildings' energy consumption without compromising indoor thermal comfort ([6], [7], [8]). Some researchers have even made the data they collected during the study period publicly available (for example, WSU CASAS [9] or DomusLab [10]), which allows other researchers to analyze the collected data from different perspectives and draw their own conclusions, or simply use the published datasets to evaluate their thermal comfort models and compare the results.

However, the conclusions inferred from one study should not be applied to a case study with different conditions, because the data may not be properly aligned with the problem. Different studies can output very different datasets, considering that buildings have different dynamics.


In addition, not all datasets from thermal comfort studies are released, due to data privacy policies or to the lack of quality of the dataset, which reduces the number of scenarios covered. This constitutes another major drawback in the study and evaluation of thermal comfort, delaying the development of energy consumption reduction strategies.

Therefore, it has not yet been possible to create a general solution to reduce energy consumption at home. It is within this context that this dissertation positions itself.

1.2 Objectives

This dissertation intends to conduct an indoor thermal comfort study based on the Portuguese families' lifestyle and the Mediterranean climate.

It does not aim to reduce energy consumption immediately, by analyzing the data in real time and suggesting actions or behavioral adaptations. Instead, it focuses only on creating an efficient and correct method to collect data about households' thermal comfort. As long as the built dataset has good quality, the comfort-related data can later be analyzed in order to develop new technologies and strategies to reduce energy consumption whilst maintaining the same levels of comfort.

Besides collecting a quality dataset, another major concern in this project is to protect the anonymity and confidentiality of the participants’ data. Considering that this study involves sensitive data, it is important to prevent unauthorized disclosure of their personal data.

1.3 Structure

This dissertation is organized into 6 chapters, one of them being the Introduction chapter just presented. The remaining 5 chapters are described as follows:

• Chapter 2: In the second chapter, the state of the art is presented, containing a brief description of the key concepts generally used in this field and detailed information regarding solutions, available and developed to date, related to the context in question.

• Chapter 3: In chapter 3, based on the knowledge obtained from the previous chapter, a solution to achieve the goal of this dissertation is proposed. A possible architecture for the solution is described, as well as the general requirements and functionalities associated with it.

• Chapter 4: In the fourth chapter, the practical implementation of the solution proposed in chapter 3 is presented in detail, justifying the decisions taken during the process.

• Chapter 5: In this chapter, the results obtained from the practical implementation of the proposed solution are presented and analyzed, and some conclusions are inferred from them.

• Chapter 6: The last chapter summarizes the solution developed and validates it against the general requirements presented in chapter 3. It ends with some suggestions for possible improvements and future work.


CHAPTER 2

State of the art

2.1 Data collection methods

Nowadays, data can be collected using several existing methods. Some examples are observation, interviews and focus groups, journalling, surveys, online monitoring, crowdsourcing, sensing, etc. Each one has its advantages and disadvantages, as summarized in Table 2.1.

Table 2.1: Advantages and disadvantages of several data collection methods

Observation, interviews and focus groups
  Advantages: Allows retrieving very complete and precise data related to the topic in question.
  Disadvantages: Very time consuming and requires heavy intervention from the participants, which can be exhausting for them.

Surveys and journalling
  Advantages: Easy for the researchers to deploy.
  Disadvantages: Also requires heavy intervention from the participants, which can be exhausting for them.

Crowdsourcing and online monitoring
  Advantages: The data collection process can be automated. Allows collecting a lot of data related to the topic in question. Requires little or even no user intervention.
  Disadvantages: Once lots of data are collected, cleaning the dataset can be hard and time consuming for the researchers.

Sensing
  Advantages: Allows automating the data collection process, providing precise measurements while requiring little or even no user intervention. The sensor devices used can be very inexpensive, although capable of providing precise measurements.
  Disadvantages: Sensing devices can be fragile and, depending on the situation, deployment costs can be expensive. In addition, since the users do not control the data being collected, they can feel uncomfortable about the process.


The choice of the data collection method depends on several aspects, such as the target population, the type of data required to collect, the performance indicators used to analyze it, and even the time and resources available [11]. For instance, if the study is about a deep and complex topic, which requires more detailed data, a personal interview might provide a more detailed response than a questionnaire. If the study is aimed at the elderly community, a less technological method might be more adequate. On the other hand, if the study relies on volunteer participants, the least routine-disturbing and intrusive method should be chosen so that the participants do not feel bothered.

If the study requires collecting data for a long period of time or from a large number of sources, a cost-efficient automated method based on sensors and Internet of Things (IoT) platforms is probably more suitable than one that requires manual labor, because the automatic reporting of readings facilitates data acquisition, especially when sensors are numerous. For the same reasons, crowdsourcing and online tracking are very common as well. The other methods, like observation, interviews, surveys, and journalling, though they can use more or less technology, usually require more human effort than the three mentioned before.

Thermal comfort studies usually consist of collecting measurements of indoor and outdoor environments and comparing them against occupants’ thermal comfort perception.

In [12], the authors stated that there are two main methods to collect data for the evaluation of thermal comfort: measurements of indoor parameters, and the application of thermal comfort surveys. These two methods can be used simultaneously in order to obtain maximum information about thermal comfort monitoring and improve the analysis, since they complement each other.

The first method involves physical monitoring of the building and the exterior, nowadays performed by sensors and IoT platforms, thanks to the last two decades of technological advances. This method allows collecting objective data, i.e., data that does not depend on user opinion or interaction and that reflects the actual state of the environment. In fact, in this case, the less human intervention required to collect the data, the better, so that data cannot be delayed or manipulated by the occupants. This is one of the many reasons why sensors and IoT platforms should be used to collect environmental data: the process is automatic.

Thermal comfort surveys gather occupants' perceptions and preferences about thermal comfort. This is considered subjective data because it expresses their thermal comfort preferences and the differences between individuals. It is usually harder to obtain than the first type because it requires the occupants' intervention. Although surveys are clear, concise and easy to answer, at least when they follow best practices, they still depend on the occupants' predisposition to answer.

2.2 IoT Platforms

Sensor devices and IoT platforms have proven to be efficient in data collection, as will be seen later in the projects presented in section 2.4.


IoT has become an increasingly popular concept in domestic buildings. IoT stands for Internet of Things and is characterized by having a large number of devices sharing data among them, with or without user intervention. These things can be as small and simple as sensors, smartphones, smartbands, laptops and appliances, or as complex as servers, clouds, etc.

Thus, an IoT platform (often referred to as IoT middleware) is no more than software that enables connections between those things. It acts as middleware to connect hardware devices to software platforms, so that different devices that, a priori, were not able to communicate with each other can connect.

IoT platforms are also responsible for monitoring the connected devices and for acquiring, processing, organizing and storing the data they produce [13]. Usually, they also allow remote data access and visualization on several devices (laptops, smartphones, ...), anywhere, anytime.

2.2.1 Generic Architecture

A generic architecture for a data collection IoT platform can be divided into two major components: the data acquisition layer, and the cloud platform.

The sensors are responsible for the data acquisition part and for sending the data to the platform. Sensors can be physical or virtual [14]. Physical sensors are sensors that use physical devices to directly interact with the world and measure parameters of the physical environment, like the temperature or humidity of a room. Virtual sensors can also take measurements and produce relevant data for the platform, but use software processes instead of physical devices.

When sensors are not able to communicate with the cloud by themselves, they can be connected to a gateway. The gateway translates messages between different protocols and forwards communication between the sensors and the cloud. Depending on its processing capabilities, the gateway can execute some pre-processing, such as data aggregation, before sending the collected data to the cloud. The cloud can provide more or fewer functionalities, depending on the final objective of the platform. These functionalities can be centered in one single cloud, or multiple clouds can be integrated, dividing or replicating the responsibilities.
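As an illustration of the gateway role just described, the following Python sketch aggregates local sensor readings and forwards them upstream in batches. It assumes a hypothetical cloud HTTPS ingestion endpoint, a local MQTT broker on which sensors publish JSON under "sensors/#", and the paho-mqtt and requests libraries; it is not the dissertation's own implementation.

    # Minimal gateway sketch: aggregate local sensor readings and forward them to a cloud endpoint.
    # Assumptions (not from the dissertation): sensors publish JSON on local MQTT topics under
    # "sensors/#", and the cloud exposes a hypothetical HTTPS ingestion endpoint.
    import json
    import threading
    import time

    import paho.mqtt.client as mqtt   # pip install paho-mqtt
    import requests                   # pip install requests

    CLOUD_URL = "https://cloud.example.org/api/ingest"    # hypothetical endpoint
    buffer = []                                           # readings waiting to be forwarded
    lock = threading.Lock()

    def on_message(client, userdata, msg):
        """Store each local sensor reading together with its topic and a timestamp."""
        with lock:
            buffer.append({"topic": msg.topic,
                           "payload": json.loads(msg.payload),
                           "ts": time.time()})

    def forward_loop():
        """Every 60 s, send the aggregated batch to the cloud; keep it locally on failure."""
        global buffer
        while True:
            time.sleep(60)
            with lock:
                batch, buffer = buffer, []
            if not batch:
                continue
            try:
                requests.post(CLOUD_URL, json=batch, timeout=10).raise_for_status()
            except requests.RequestException:
                with lock:
                    buffer = batch + buffer   # retry the batch on the next cycle

    client = mqtt.Client()                    # paho-mqtt 1.x style constructor
    client.on_message = on_message
    client.connect("localhost", 1883)
    client.subscribe("sensors/#")
    threading.Thread(target=forward_loop, daemon=True).start()
    client.loop_forever()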

In general, a cloud platform is responsible for receiving the collected data, supporting different protocols and data formats to ensure communication with the most varied data sources. The received data can undergo some processing before being stored, allowing events, alarms or actions to be triggered.

The cloud can also manage connected devices. The devices must register with the platform so that only trustworthy devices can send data. Other security mechanisms can and should be implemented. Device management also ensures the devices are working properly and can include remote update functions.

The platform can also support data analytics, which performs a range of complex analyses over the stored data, from basic data clustering and deep machine learning to predictive analytics.

Finally, the platform should provide means to access and visualize the collected data, such as visualization dashboards [15].

However, other architectures exist that vary slightly from the one described above, such as [16], [17], [18].

2.2.2 Requirements

A well designed IoT platform should fulfill the generic requirements presented in the next sections, which also represent some of the challenges that IoT platform designers are still trying to overcome [19].

2.2.2.1 Scalability

An IoT platform should be able to expand with the increase of connected devices and data flow. The system must scale vertically, for storage purposes, and horizontally, for processing support [20]. The architecture of the platform must be planned from the beginning not only to allow the addition of more storage capacity, communication, and processing capabilities, but also to support new communication protocols and different devices.

2.2.2.2 Heterogeneity and Interoperability

The increase in the number of devices connected to IoT platforms in the last decades has also increased the heterogeneity of resources, capabilities, communication protocols, data formats, standards and functionalities a platform has to handle. However, as mentioned before, an IoT platform serves to facilitate the connection between devices, so it must ensure they can coexist and interoperate despite their differences.

2.2.2.3 Flexibility

An IoT platform should be flexible in order to fit the users' needs. IoT users usually need customized, dynamic solutions, which means they need platforms that can be deployed in different systems, that can remain simple for less complex scenarios, or that can grow in complexity by adding new features without having to restructure the entire architecture.

2.2.2.4 Availability and Performance

Since IoT platforms have spread throughout so many important sectors, like medical care, cities' supply systems, building management and government security, and due to technological advances in the IoT sector, there has been an increasing demand for IoT platforms to be always available. In some cases, a fault in the platform's availability can lead to great economic losses or even put human lives in danger. Hence, it is mandatory to meet clients' expectations and keep the platform's uptime and communication availability close to 100%.

Performance can also degrade as the platform scales; however, IoT platforms should remain fast in processing data and answering requests, avoiding any kind of delay.


2.2.2.5 Security and Privacy

Security and data privacy are key concerns in IoT because IoT platforms usually deal with sensitive data, which should be protected from any kind of unauthorized disclosure (data confidentiality) and from being lost, destroyed, corrupted or modified (data integrity) [21], and whose availability for use when needed should be guaranteed (data availability).

There are some well-known security mechanisms that can be implemented in IoT platforms, like encryption algorithms, the establishment of secure connections, firewalling, access control rules, device authentication procedures, secure booting process, and others. The security challenges arise when devices are too simple or have limited resources, preventing the use of complex security mechanisms. Besides, excessive security can affect devices and/or user interaction with the platform. That is usually the reason why security is considered a trade-off between the level of protection and the degree of usability.

2.2.2.6 Robustness and resilience

Robustness is another aspect of security. There is no point in implementing complex security mechanisms if the platform is not prepared to handle external flaws, erroneous inputs or execution errors, which can cause security breaches. These problems tend to become more pronounced as the platform scales.

In addition, IoT platforms should also be capable of automatically recovering from failures.

2.2.2.7 Summary

There are some requirements that a well designed IoT platform should comply with. First, the IoT platform should be scalable, in order to support business growth. Secondly, it should provide a large variety of resources and capabilities in order to embrace the heterogeneity of components that increases with the expansion of the business. In addition, the IoT platform should remain flexible in order to be adaptable to different work environments and user needs. However, despite business growth, the availability and performance of the IoT platform must not degrade. Finally, the data handled by the platform should be protected and its privacy secured, which should be achieved not only through security mechanisms but also through resilience against execution failures and external flaws.

2.2.3 Application Domains

IoT platforms can be divided into categories based on their application domains, i.e., their main purpose and capabilities. In [22], the author defined 10 categories: Application development, Device Management, System Management, Heterogeneity Management, Data Management, Analytics, Deployment Management, Network Monitoring Management, Data visualization, and Research. Although the categories are well defined, it is possible for platforms from one category to have features from another. Given the scope of this thesis, only three categories will be further analyzed. For each type, some examples of existing platforms will be presented. Those platforms can be integrated into IoT projects to build IoT solutions faster, cheaper and better.

2.2.3.1 Application Development Platforms

Application development platforms are considered a one-size-fits-all approach that offers users everything they need to get an IoT system off the ground [23]. They provide the means to develop and deploy IoT solutions in a single platform. These platforms usually provide features like data collection methods, device management systems, data analysis algorithms, data storage, and data visualization capabilities. Users can implement the complete platform or only some features, according to their needs.

KAA

KAA is a very complete IoT platform because it covers most of the requirements mentioned before. Briefly, the platform allows collecting data from devices, managing and controlling them, processing and analyzing the collected data, and visualizing it.

The KAA design is based on a customizable microservice architecture that promotes scalability, flexibility and third-party integration. The microservices run in Docker containers, and users can replace any of them with one of their own. This type of architecture is easily expandable by simply adding more servers to the KAA cluster. The use of Kubernetes helps to efficiently coordinate the clusters and recover from failures.

KAA supports multiple connectivity protocols, like Long Range (LoRa), Wi-Fi, Bluetooth Low Energy (BLE), Ethernet and mobile connections (2G/3G/4G), for device connection to the platform, and lightweight IoT messaging protocols such as Message Queuing Telemetry Transport (MQTT) and Constrained Application Protocol (CoAP). These protocols ensure communication over either persistent or intermittent network connections. However, the platform is, in fact, transport-agnostic and may support any IoT protocol, which means users can use a custom-tailored transport protocol for establishing communication between their devices and the system.

The communication between the devices (or gateway) and the platform is secured with Transport Layer Security (TLS) or Datagram Transport Layer Security (DTLS). In order to connect to the platform, a device has to present valid credentials, such as pre-shared keys, tokens, login and password combinations, certificates, etc. These credentials are managed by the platform. Regarding authentication, devices authenticate the server by checking its TLS certificate. Client-side authentication is divided into different levels, from none at all to checking the devices' TLS certificates.
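To make the credential and transport-security discussion concrete, the sketch below shows a generic MQTT client connecting over TLS with a username/password pair, using the paho-mqtt library. The broker host, CA file and credentials are hypothetical placeholders, and this is not tied to KAA's actual SDK.

    # Generic sketch of a device connecting to an IoT platform broker over TLS with credentials.
    # Host name, port, CA file and credentials are hypothetical placeholders, not KAA-specific.
    import paho.mqtt.client as mqtt  # pip install paho-mqtt

    client = mqtt.Client(client_id="sensor-42")            # paho-mqtt 1.x style constructor
    client.username_pw_set("device-login", "device-password")

    # Server authentication: the broker certificate is validated against a trusted CA bundle.
    client.tls_set(ca_certs="ca.crt")
    # Mutual TLS (client-side certificates) could be configured instead:
    # client.tls_set(ca_certs="ca.crt", certfile="device.crt", keyfile="device.key")

    client.connect("broker.example.org", 8883)             # 8883 is the conventional MQTT/TLS port
    client.loop_start()
    info = client.publish("telemetry/temperature", '{"value": 21.5}', qos=1)
    info.wait_for_publish()
    client.loop_stop()
    client.disconnect()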

If the devices used have limited resources, preventing them from implementing security mechanisms or supporting any of the platform's communication protocols, KAA provides a gateway architecture to enable connectivity between the device and the platform. The gateway acts as an intermediary between the two parties, performing message conversion in order to establish communication.

The gateway is also useful in data collection to ensure data delivery, because it allows buffering the data collected by the devices locally and resending it to the server in case of data loss or an error occurrence. This capability relies on response codes, and the gateway only deletes the data when it receives a valid one. The KAA platform allows collecting both structured and unstructured data, like plain text, numbers, key-value maps, arrays, or nested objects. The raw collected data is organized into well-structured time series for convenient analytics and visualization. Time series allow consumers to listen to new data points and trigger particular actions based on those. Due to its modular architecture, KAA allows easy integration of various databases or data analytics systems.
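The buffering-and-acknowledge behaviour just described can be sketched roughly as follows: a simplified, hypothetical store-and-forward loop (placeholder URL and payload format, not KAA's actual gateway code) in which a reading is only dropped from the local queue after the server acknowledges it with a success code.

    # Simplified store-and-forward sketch of the acknowledgement-based buffering described above.
    # The ingestion URL and payload format are hypothetical, not taken from KAA.
    import collections
    import time

    import requests

    INGEST_URL = "https://platform.example.org/ingest"
    queue = collections.deque()           # locally buffered readings

    def buffer_reading(reading: dict) -> None:
        """Called whenever a sensor produces a new reading."""
        queue.append(reading)

    def flush() -> None:
        """Send buffered readings one by one; delete each only after a 2xx response code."""
        while queue:
            reading = queue[0]
            try:
                response = requests.post(INGEST_URL, json=reading, timeout=5)
            except requests.RequestException:
                return                     # network error: keep the data and retry later
            if response.ok:                # valid response code: safe to delete the reading
                queue.popleft()
            else:
                return                     # server rejected or failed: retry on the next flush

    if __name__ == "__main__":
        buffer_reading({"sensor": "temp-1", "value": 21.7, "ts": time.time()})
        flush()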

The data visualization component of KAA comprises a rich set of widgets, such as gauges, charts, maps, tables, etc. But this IoT platform has other purposes besides the classic data collection, analysis and visualization: it also supports device management and device control.

Device control allows checking the room temperature, changing the thermostat temperature or opening a door, for example. It is achieved by executing commands or sending messages to the devices, either synchronously or asynchronously. The asynchronous option is useful for resource-consuming commands, because the caller does not have to block waiting for the command execution to finish. Instead, the platform will notify it about the result.

Device management is responsible for registering the devices and storing detailed information about them, like location, MAC address, or other attributes, which allows devices to be grouped by their characteristics. It also controls the device connections, validating their credentials, and the reliable delivery of software updates to them, using the confirmation response codes sent by the devices with the update result. Unfortunately, KAA is neither free nor an open-source IoT platform. Its subscription plans range from $250/month to $1,000/month, and the free trial period is only 30 days.

ThingsBoard

ThingsBoard is an open-source IoT platform for data collection, processing, visualization, and device management. It claims to be scalable, fault-tolerant, robust, efficient, and customizable.

This platform can start small, using a monolithic deployment, or scale to microservices to ensure high availability. The number of supported server-side requests and devices increases linearly as new ThingsBoard servers are added in clustering mode.

ThingsBoard supports the standard IoT protocols for communication: MQTT, CoAP and Hypertext Transfer Protocol (HTTP). Secure communications are obtained through transport encryption for both the MQTT and HTTP protocols. The platform also requires device authentication and provides device credentials management.


In order to connect non-IP or constrained devices, an IoT gateway architecture is provided, which includes an Application Programming Interface (API) to integrate IoT devices connected to legacy and third-party systems with the platform. One can connect to the ThingsBoard Gateway through an external MQTT broker or an OPC-UA server (OPC Unified Architecture, a machine-to-machine communication protocol for industrial automation), but other protocols may be used by implementing custom extensions. The gateway is responsible for collecting data from the devices and converting/adapting messages between them and the platform. It also allows local data persistence, which guarantees data delivery in case of network and hardware failures. In addition, associated with data replication in the cloud, this feature ensures reliable data collection and storage. It supports Java, Python, Go, C/C++ and other languages.

ThingsBoard supports three database options to store data: a Structured Query Language (SQL) database (PostgreSQL by default), a Not Only SQL (NoSQL) database (Cassandra is the only one supported), and a hybrid option (all entity data, like devices, assets, customers, dashboards, etc., is stored in the SQL database, and all telemetry data, like attributes, time-series sensor readings, statistics and events, in the NoSQL database).

The data can be accessed and/or visualized using customizable widgets, real-time dashboards or server-side APIs. The dashboards are easily built using a drag-and-drop editor and can be assigned to multiple customers. Any user will be able to see and remotely control only their own devices and will not have any access to other users' data. The remote control is ensured by bi-directional communication with the devices, through Remote Procedure Call (RPC) commands. ThingsBoard also includes the ability to register and manage connected devices and to define alarms. Real-time alarm monitoring includes raising an alarm when a certain event occurs, like a device being disconnected or inactive.
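As an illustration of how a device typically pushes telemetry into ThingsBoard, the following sketch uses the device MQTT API as commonly documented (the per-device access token as MQTT username and the v1/devices/me/telemetry topic). The host and token are placeholders, and the exact topic and authentication scheme should be confirmed against the ThingsBoard documentation for the deployed version.

    # Sketch: publishing one telemetry sample to a ThingsBoard instance over MQTT.
    # Host and access token are placeholders; topic/authentication follow the commonly
    # documented ThingsBoard device MQTT API and should be verified for the target version.
    import json
    import paho.mqtt.client as mqtt

    THINGSBOARD_HOST = "thingsboard.example.org"   # placeholder
    ACCESS_TOKEN = "DEVICE_ACCESS_TOKEN"           # per-device credential issued by the platform

    client = mqtt.Client()
    client.username_pw_set(ACCESS_TOKEN)           # the device token is used as the MQTT username
    client.connect(THINGSBOARD_HOST, 1883)
    client.loop_start()

    telemetry = {"temperature": 21.5, "humidity": 48}
    info = client.publish("v1/devices/me/telemetry", json.dumps(telemetry), qos=1)
    info.wait_for_publish()

    client.loop_stop()
    client.disconnect()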

The rule engine is considered the core of the ThingsBoard platform. It allows filtering, enriching and transforming system events and device telemetry, as well as triggering actions, like creating alarms or pushing filtered data to an external message queue for advanced analytics.

The ThingsBoard Community Edition is a free, open-source version that supports unlimited devices, assets and software updates. The difference between the free Community Edition and the paid Professional Edition is that the latter includes more features, like third-party platform integration (Amazon AWS, International Business Machines (IBM) Watson, Microsoft Azure, OceanConnect, Sigfox, etc.), white-labeling (changing the logo and color scheme), scheduling various types of events (like report generation, commands to devices and configuration updates), and creating entity groups (of devices, assets, etc.) to simplify administration tasks.

OpenIoT

OpenIoT [24] is an open-source platform for data collection, data storage, and device management that combines principles from two projects: the Global Sensor Networks (GSN) and the Linked Sensor Middleware (LSM) projects.


The architecture of the platform comprises 7 elements with well-defined roles. The first one is the Sensor Middleware, which is responsible for collecting data and performing some preprocessing on it, like filtering. It is based on X-GSN, an extended version of the GSN middleware that supports semantic annotation of both sensor data and metadata. This element also contains a mobile broker (a publish/subscribe middleware) that is used for the integration of mobile sensors. The Mobile Broker sits between the sensors/users and the cloud. It registers the sensor in the Cloud Data Storage, announcing the type of data it can publish. This mechanism ensures that only relevant data is pushed into the cloud and subsequently transmitted in near real time to the appropriate mobile devices.

The second element is the Cloud Data Storage. This LSM implementation, which has been re-designed with push-pull data functionality and cloud interfaces, acts as a cloud database that enables storage of data streams stemming from the sensor middleware. It also stores metadata required for the operation of OpenIoT.

The Scheduler is the architectural element that processes the requests for on-demand deployment of services and manages proper access to the resources they require (for example, data streams). It discovers sensors and associated data streams that can contribute to a given service. It also manages the service and activates the resources involved in its provision. The requests are formulated and submitted to the Scheduler by the Request Definition component, which is supported by a Graphical User Interface (GUI). To deliver the data streams processed by the Scheduler to the requested service, there is a Service Delivery & Utility Manager. This component is associated with the Request Presentation component, which is in charge of the visualization of the outputs of a service.

Finally, there is also a Configuration and Monitoring element to enable visual management and configuration of functionalities over sensors and services.

This platform allows collecting data from any sensor in the world, including physical devices, sensor processing algorithms, social media processing algorithms and more. In the OpenIoT concept, the term sensor refers to any component that can provide observations. It can be an aggregation or computation over other virtual sensors, or even represent a mathematical model of a sensing environment.

In order to propagate its data to the rest of the OpenIoT platform and to allow other applications and users to discover it and access its data, each sensor needs to register with the LSM. The sensor is registered through X-GSN by posting a semantically annotated representation of its metadata. X-GSN takes care of creating the semantic annotations in Resource Description Framework (RDF), according to the OpenIoT ontology, and of posting them to the LSM cloud store repository.

After its registration, the sensor is available for discovery and querying from the upper layers of the OpenIoT architecture. Data acquisition for each virtual sensor is achieved based on wrappers that collect data through serial port communication, User Datagram Protocol (UDP) connections, HTTP requests and more.

X-GSN implements wrappers for these data providers and allows users to develop custom ones.
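To give an idea of what such a wrapper looks like, the sketch below implements a generic UDP data provider in Python: it listens for JSON-encoded readings and hands each decoded observation to a callback, mirroring the wrapper role described above. It is a standalone illustration with a hypothetical port and payload format, not actual X-GSN code.

    # Generic illustration of a sensor "wrapper": receive raw readings over UDP and hand
    # each decoded observation to the rest of the system. Not actual X-GSN code.
    import json
    import socket
    from typing import Callable

    def udp_wrapper(port: int, on_observation: Callable[[dict], None]) -> None:
        """Listen for JSON-encoded sensor readings on a UDP port and forward them."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", port))
        while True:
            data, addr = sock.recvfrom(4096)          # one datagram per reading
            try:
                observation = json.loads(data)
            except json.JSONDecodeError:
                continue                              # ignore malformed packets
            observation["source"] = addr[0]           # annotate with the sender's address
            on_observation(observation)

    if __name__ == "__main__":
        udp_wrapper(5005, lambda obs: print("new observation:", obs))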


Data are represented as streams of data tuples that can be consumed, queried or analyzed online. LSM provides a wide range of interfaces (wrappers) for accessing sensor readings, such as physical connections, middleware APIs, and database connections. Each wrapper is pluggable at runtime, so wrappers can be developed to connect new types of sensors into a live system while it is running. Queries over Linked Stream Data are continuous, which means they are continuously executed as new data arrives, with new results being output as soon as they are produced.

OpenIoT user management, authentication, and authorization are performed by the privacy & security module (Central Authentication Service - CAS), which implements the OAuth 2.0 protocol. Users are redirected to a central login page the first time they try to access a restricted resource, where they provide their username and password to the central authentication entity. If authentication is successful, the CAS redirects the user to the original web page and returns a token to the web application. Tokens represent authenticated users, have a predefined expiration time and are valid only until they expire. The token is forwarded from one service to the next in a request chain, e.g., from the user interface to the LSM. Services can check if the token is valid, or use the token to check if the user it represents has the necessary access rights.
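The token-forwarding pattern just described can be sketched as follows, using the requests library with hypothetical service URLs: each service passes the bearer token along in the Authorization header and may ask the authentication service to validate it before serving the request. This is a generic illustration, not OpenIoT's actual CAS integration.

    # Generic sketch of forwarding an OAuth-style bearer token along a request chain.
    # All URLs are hypothetical placeholders; this is not OpenIoT's actual CAS code.
    import requests

    AUTH_VALIDATE_URL = "https://auth.example.org/validate"   # hypothetical token-check endpoint
    LSM_QUERY_URL = "https://lsm.example.org/query"           # hypothetical downstream service

    def handle_ui_request(token: str, query: dict) -> dict:
        """A front-end service validates the caller's token, then forwards it downstream."""
        check = requests.get(AUTH_VALIDATE_URL,
                             headers={"Authorization": f"Bearer {token}"}, timeout=5)
        if check.status_code != 200:
            raise PermissionError("token expired or invalid")

        # The same token is forwarded to the next service in the chain (e.g., the LSM),
        # which can repeat the validation or check the user's access rights.
        response = requests.post(LSM_QUERY_URL, json=query,
                                 headers={"Authorization": f"Bearer {token}"}, timeout=10)
        response.raise_for_status()
        return response.json()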

SCoT

SCoT stands for Smart Cloud of Things and consists of an IoT/Machine-to-machine (M2M) platform developed by the Telecommunications Institute of the University of Aveiro that focuses on communicating with IoT devices and on processing, storing and visualizing data.

This platform can collect data and events directly from IoT devices or through gateways. Only registered devices and gateways are able to send data to the platform.

The architecture diagram of this platform can be seen in figure 2.1.


This IoT platform is deployed in a Docker Swarm environment to promote easy integration of several applications, as well as fast, secure and reliable processing.

It also uses Eclipse Hono, a service that acts as middleware to connect large numbers of IoT devices to a backend, regardless of the devices' communication protocols.

Some components of the Hono architecture that are worth mentioning are [25]:

• the Device Registry instance, that manages registration information and issues device registration assertions to protocol adapters;

• the Apache QPID Dispatch Router instance, that supplies telemetry data and events from devices to downstream applications;

• the Apache ActiveMQ Artemis instance, that acts as persistent storage for events;

• the Prometheus instance, for storing metrics data from services and protocol adapters;

• a Grafana instance, that provides a dashboard for visualizing the collected metrics data.

When a device tries to connect to one of Hono’s protocol adapters, the protocol adapter first tries to authenticate the device using information kept in the Device Registry. This means that, before a device can connect to Hono and publish any data, the corresponding information needs to be added to the Device Registry, constituting the device registration process.

Also, Hono supports common authentication mechanisms like username/password and X.509 client certificates to verify a device’s identity and uses TLS when communicating with devices.

The collected data is consumed from Hono by the RabbitMQ component, which sends it to the other components of the SCoT platform, such as a database for persistent storage. The databases that this platform includes are Apache Cassandra, PostgreSQL, and InfluxDB, but the one responsible for storing the data collected by the devices is the first one. Hono supports devices communicating via common IoT protocols like HTTP and MQTT, and even provides a simple mechanism to add custom protocol adapters.
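As an illustration of the consumption step described above, the sketch below uses the pika client to read telemetry messages from a RabbitMQ queue and hand them to a storage callback. The queue name, host and store() function are hypothetical and do not come from the SCoT code base.

    # Sketch: consuming device telemetry from a RabbitMQ queue and persisting it.
    # Queue name, host, and the store() function are hypothetical placeholders.
    import json
    import pika  # pip install pika

    def store(reading: dict) -> None:
        """Placeholder for writing a reading to persistent storage (e.g., Cassandra)."""
        print("storing:", reading)

    def on_message(channel, method, properties, body):
        store(json.loads(body))
        channel.basic_ack(delivery_tag=method.delivery_tag)   # acknowledge only after storing

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq.example.org"))
    channel = connection.channel()
    channel.queue_declare(queue="telemetry", durable=True)     # hypothetical queue name
    channel.basic_consume(queue="telemetry", on_message_callback=on_message)
    channel.start_consuming()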

2.2.3.2 Device Management Platforms

Device management platforms specialize in handling and managing IoT devices. They ensure that everything is connected and secure, report metrics, keep an updated status of the devices, and notify users about changes in them. These platforms can also be responsible for updating the devices’ firmware.

UpSwift

UpSwift [26], [27] consists of a GUI-based management interface that enables updating, managing, controlling and diagnosing IoT and embedded devices.

The users can register their devices through the UpSwift dashboard and then manage them. In order to connect a device to the UpSwift cloud, it is necessary to download and install the UpSwift client on the device. This requires the device to meet some requirements: a Linux OS with systemd and the apt package manager needs to be installed, which means that the client software won't work with init.d-based Linux OSs and may be challenging to run on Red Hat Package Manager (RPM)-based Linux OSs.

After the client software is installed on the device, it starts syncing data with the cloud, sending keep-alive status messages and checking for new updates, new remote connections or changes in project parameters.

The interaction between the devices and the cloud back-end is based on a client/server architecture with a pull-based mechanism, which allows remote control of and access to the device without compromising its security. In addition to the platform's security mechanisms, the communication between the client and the server is established over secured Representational State Transfer (REST) APIs, and all of the device's data is encrypted on disk with the Advanced Encryption Standard (AES)-256 algorithm, with the decryption keys stored on separate machines.

The platform also fetches application logs from the edge device and allows continuously monitoring the application and device status in real-time through the dashboard.

The platform also has a smart diagnostic tool that is capable of automatically detecting suspicious or unusual behaviors from the devices and sending an email to alert the administrator.

Nevertheless, the key feature of this device management system is the support for Over-the-air (OTA) updates to the connected devices. The client software looks for updates at a configurable interval and applies the changes and updates. This platform also allows executing pre- and post-install commands along with the update deployments and provides the option to roll back to the previous state in case of update failure.
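The pull-based update flow with rollback described above can be sketched as follows. This is a simplified, hypothetical client loop, with placeholder URLs, file paths and shell scripts, not UpSwift's actual agent.

    # Simplified sketch of a pull-based OTA update client with rollback, as described above.
    # The update server URL, package handling, and shell scripts are hypothetical placeholders.
    import subprocess
    import time

    import requests

    UPDATE_URL = "https://updates.example.org/latest"   # hypothetical update metadata endpoint
    CHECK_INTERVAL = 15 * 60                            # poll every 15 minutes (configurable)

    def apply_update(package_path: str) -> bool:
        """Run pre-install, install, and post-install steps; return False on any failure."""
        steps = [["sh", "pre-install.sh"],
                 ["sh", "install.sh", package_path],
                 ["sh", "post-install.sh"]]
        return all(subprocess.run(step).returncode == 0 for step in steps)

    def rollback() -> None:
        """Restore the previously working version (placeholder command)."""
        subprocess.run(["sh", "rollback.sh"])

    current_version = "1.0.0"
    while True:
        meta = requests.get(UPDATE_URL, timeout=10).json()     # e.g. {"version": ..., "url": ...}
        if meta["version"] != current_version:
            package = requests.get(meta["url"], timeout=60)
            with open("/tmp/update.pkg", "wb") as fh:
                fh.write(package.content)
            if apply_update("/tmp/update.pkg"):
                current_version = meta["version"]
            else:
                rollback()                                      # failed update: revert
        time.sleep(CHECK_INTERVAL)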

The pricing plans are available upon registration and vary with the number of devices added to the platform.

Thinger.io

Thinger.io is another open-source platform that allows connecting and managing IoT devices. It aims to reduce the complexity of connecting a device to an IoT platform, providing a ready-to-use, scalable cloud infrastructure for connecting things that allows users to monitor and control their devices without worrying about the required cloud infrastructure.

The server is multithreaded, coded in C++ with ASIO techniques to ensure maximum performance while consuming fewer resources, which provides the better scaling required in the IoT domain. It can be deployed on any architecture, like x86, amd64 or arm64. However, the Thinger.io IoT platform requires a MongoDB server for storing some server information, so it is recommended to use 64-bit architectures, as the MongoDB database is limited to 2 GB of data on 32-bit systems.

The server handles all the things' connections, providing authentication and communication [28]. All device resources can be accessed from a REST API. The communication with the devices is based on the Transmission Control Protocol (TCP) and TLS protocols, and on an encoding protocol specifically designed in this project for microcontrollers and devices with small memory capabilities, named protoson. This protocol can be directly transcoded to and from JavaScript Object Notation (JSON), so all the data transfer from/to the devices through the REST API interface is done in JSON. This saves communication bandwidth on the devices' side while allowing maximum interoperability with external clients like web or mobile devices.

All interactions with connected devices need to be authenticated against the platform, using an access token.

It is possible to configure a Simple Mail Transfer Protocol (SMTP) server for sending emails through the endpoints.

At the other end of this project are the client libraries, which are compiled into the devices to expose them to the Internet: the Arduino client and the ARM mbed client. These libraries have been designed with microcontrollers in mind, so they can be compiled both on small microcontrollers like Arduinos and on more powerful devices like the ESP8266, Raspberry Pi, Intel Edison, etc. However, all the code is C++, so it can be compiled on almost every system. This reinforces the hardware-agnostic character of this platform. The libraries use the already mentioned protoson encoding for communications with the server, which reduces memory and bandwidth requirements compared to other encoding techniques like JSON, Extensible Markup Language (XML) or string parsing. The client libraries also allow calling configurable endpoints in the cloud infrastructure for sending emails, sending HTTP/HTTPS requests, and so on.

The cloud console is a back-end console that is responsible for managing devices, endpoints, data buckets, and access tokens. This console is linked to a front-end dashboard that allows easy information visualization.

The Thinger.io platform provides a mobile app to monitor the devices, visualize their data in real time through charts, update their resources, and check their status. It also offers a CLIMASTICK board for purchase, which integrates WiFi connectivity along with a set of powerful sensors to provide environmental and motion sensing. It can be configured using the Arduino IDE.

The disadvantages of this platform are that the free version only supports a maximum of 2 devices, stores the data in a shared cloud, and only includes community support. For more than 100 devices, an isolated cloud and custom support, the plans start at $199/month.

2.2.3.3 Data Visualization Platforms

Data visualization platforms are responsible for displaying data to the users. They organize the data into graphs, tables, or dashboards. Some dashboards and data visualization tools are discussed below, though it is important to distinguish their purposes: while some dashboards provide telemetry data visualization, others support the data collection process and display the data flow.


Dash (Plotly)

Dash is an open-source library, released under the permissive MIT license. Plotly develops Dash and offers a platform for easily deploying Dash apps.

Dash is a framework for building data visualization web apps, written on top of Flask, Plotly.js, and React.js. Dash applications are essentially web servers running Flask that communicate JSON packets over HTTP requests. Dash's frontend then renders the components using the React.js and Plotly.js libraries [29].
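
To make this stack concrete, the minimal app below (a generic sketch with made-up sample data, written against Dash 2.x; newer releases rename run_server to run) serves a single page containing one Plotly chart from a Flask development server.

```python
# Minimal Dash app: a Flask server that serves one page with a single Plotly chart.
# The temperature values are made up purely for illustration.
import dash
from dash import dcc, html
import plotly.graph_objects as go

app = dash.Dash(__name__)

hours = [0, 4, 8, 12, 16, 20]
indoor_temp = [19.8, 19.5, 20.7, 22.4, 23.1, 21.6]  # sample data only

app.layout = html.Div([
    html.H3("Indoor temperature (sample data)"),
    dcc.Graph(figure=go.Figure(data=[go.Scatter(x=hours, y=indoor_temp, mode="lines+markers")])),
])

if __name__ == "__main__":
    app.run_server(debug=True)  # starts the underlying Flask development server
```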

The data source should be an SQL database.

Grafana

Grafana [30] is an open-source analytics and monitoring solution for databases. It is commonly used for monitoring data in real time. Besides visualizing the data, it also allows querying it, generating alerts and sending notifications to systems like Slack, exploring logs, and understanding the metrics.

Grafana provides support for many different data sources: InfluxDB and Prometheus are examples of the time series databases available; for relational databases, one can easily plug into MySQL or PostgreSQL (or TimescaleDB); indexes are also available via the Elasticsearch connector [31].
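
As an illustration of how sensor readings typically reach one of these data sources, the hedged sketch below writes a single temperature/humidity point into an InfluxDB 1.x database using the influxdb Python client; a Grafana dashboard pointed at the same database can then query and plot it. The database name, measurement and tags are assumptions made for this example only.

```python
# Hedged sketch: push one sensor reading into InfluxDB 1.x so that a Grafana
# dashboard can later query and plot it. Names and values are illustrative only.
from datetime import datetime, timezone

from influxdb import InfluxDBClient  # pip install influxdb (InfluxDB 1.x client)

client = InfluxDBClient(host="localhost", port=8086, database="thermal_comfort")
client.create_database("thermal_comfort")  # no-op if the database already exists

point = {
    "measurement": "indoor_environment",
    "tags": {"room": "living_room", "sensor": "zigbee-01"},
    "time": datetime.now(timezone.utc).isoformat(),
    "fields": {"temperature_c": 21.4, "humidity_pct": 48.0},
}
client.write_points([point])

# In Grafana, the same database would be added as an InfluxDB data source and
# queried with InfluxQL, e.g.:
#   SELECT mean("temperature_c") FROM "indoor_environment"
#   WHERE $timeFilter GROUP BY time(1m)
```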

2.2.4 IoT Platforms Comparison

The information about the IoT Platforms described in the last section is summarized in tables 2.2 and 2.3.

These platforms are only a few examples of the IoT platforms already available on the market. They facilitate the implementation of IoT projects, either as full IoT platforms that are ready to deploy or as platforms that are easy to integrate and provide specific features to an IoT platform already built.


Table 2.2: IoT Platforms comparison - Part 1

Platform | Category | Scalability | Heterogeneity, Interoperability and Flexibility | Security
KAA | App development | scalable (microservices, Docker, Kubernetes) | LoRa, Wi-Fi, BLE, Ethernet, 2G/3G/4G; MQTT, CoAP, custom; gateway | TLS/DTLS; certificates
Thingsboard | App development | scalable (microservices) | MQTT, CoAP, HTTP; gateway | transport encryption
OpenIoT | App development | scheduler; supports scalability of sensor networks | virtual sensors or physical devices | OAuth 2.0 protocol
SCoT | App development | scalable (microservices, Docker, Cassandra clusters) | Apache Cassandra, InfluxDB and PostgreSQL; HTTP, MQTT | device authentication (username/password) and TLS communications
Upswift | Device Management | — | limited (requires Linux OS, apt package manager and systemd) | secure communications and data storage encryption with AES-256 (decryption keys stored on separate machines)
Thinger.io | Device Management | multithreaded server; recommended to use 64-bit architectures | REST API; hardware-agnostic; integrates with third-party platforms and custom programs | TLS; protoson; access token for authentication
Dash (Plotly) | Data visualization | joins data from multiple data sources | data source: SQL database | HTTP Basic Auth and Plotly OAuth
Grafana | Data visualization | mixed data sources | data sources: InfluxDB, Prometheus, MySQL or PostgreSQL, TimescaleDB, Elasticsearch connector | OAuth; username/password by default; limit data source URL


Table 2.3: IoT Platforms comparison - Part 2

Platform | Robustness / Resilience | Data Visualization | Price | Other
KAA | Kubernetes; data retransmission | gauges, charts, maps, tables, time series, etc. | not free, not open-source (free trial period of 30 days) | —
Thingsboard | local data persistence guarantees data delivery in case of network failures; data replication in the cloud | customizable widgets, real-time dashboards or server-side APIs | free and open-source version | users can access their data and remotely control their devices; alarm system; rule engine
OpenIoT | — | facilitates service presentation in a Web 2.0 interface, for maps and graphs | free, open-source | —
SCoT | Docker | CQL queries, Grafana | free | —
Upswift | rollback in case of failure | dashboard, maps | not free | alert system; diagnostics and OTA updates
Thinger.io | — | dashboard; mobile app; charts | open-source, free version (limited resources) | —
Dash (Plotly) | prevents code injection | dashboards, interactive graphs, plugins to visualize metrics and logs, lists, charts, tables | open-source library | —
Grafana | prevents code injection | dynamic dashboards, graphs, plugins to visualize metrics and logs, lists, charts, tables, annotations | — | —


2.2.4.1 Open Source vs. Proprietary Software/Hardware

Along with those IoT platforms, other turn-key solutions for monitoring indoor parameters are also already available on the market (for example, Develco5, Wulian6, Sensirion7, Fierce8, Climax9, Centralite10, 4Noks11).

However, as one could conclude from the previous sections, they are often based on proprietary hardware or software.

As [17] stated, commercial solutions generally tend to offer many features. However, they are mainly closed source and implement proprietary technologies instead of standardized ones. Most of them also do not provide open APIs for the general public, which could enable quick adoption and the development of generic solutions. In addition, products from different companies or manufacturers are often incompatible with one another, which makes communication and integration between them a challenge. Furthermore, many of them automatically upload data to the company cloud, which, apart from raising privacy issues, requires integrating yet another IoT platform if developers want to process or manage the data by themselves. Finally, although commercial platforms provide many features to their clients, they are also restrictive in terms of customization to the users' needs.

Therefore, proprietary software and hardware solutions, besides compromising the flexibility and functionality of the system, delay the platform deployment and increase its costs [32], [33]. Open-source solutions, on the other hand, provide more basic features and are not as advanced, but they offer a basis for further development and the possible realization of generic solutions.

Considering that this is the era of the Internet of Things, and that there has been increasing interest in developing generic sensors and generic IoT platforms that use standard communication protocols and can be interconnected without limitations, the obvious solution is to build an IoT platform based on open-source software and hardware.

2.3 User participation in thermal comfort case studies

The importance of selecting an adequate and effective method for collecting data and creating a good dataset has already been mentioned. However, in studies that involve people’s participation, like thermal comfort studies, the choice of the most appropriate data collection method to use is not the only thing that must be taken into consideration.

5 https://www.develcoproducts.com/products/starter-kits/
6 http://www.wuliangroup.com/en/
7 https://www.sensirion.com/kr/environmental-sensors/evaluation-kit-sek-environmental-sensing
8 https://www.fierceelectronics.com/components/comfort-health-and-convenience-are-roles-sensors-smart-home
9 http://www.climax.com.tw/home-automation.php
10 https://store.centralite.com/
11 https://www.4-noks.com/products/?lang=en


In [34], it was stated that the study cannot disrupt the daily life of the participants. The author of In-use monitoring of buildings: An overview of data collection methods also reflected on some other aspects, described below.

The study must be explained in detail to the occupants, and their consent must be expressly collected before they start to participate in the study.

In studies like this, users frequently worry about the disadvantages the study can bring to them, such as feeling under constant surveillance or the risks of providing personal data or having their daily routines monitored, which reduces their willingness to take part in the study. Users must be free to end their participation whenever they want. Hence, it is important to engage them in order to reduce the possibility of dropping out of the research project. In addition, the participants should feel supported during the entire time they are involved in the study.

It is crucial to ensure the privacy of occupants and a non-disclosure policy for personal data. The platform's architecture design must guarantee these requirements, which must also be satisfied in the practical implementation. In addition, the design must also comply with the country's Data Protection Act. The most recent General Data Protection Regulation that took effect in Portugal requires informing the participants about the legal basis of data processing and obtaining their consent. It also requires assessing the risk of data theft and implementing security measures in order to ensure confidentiality and integrity and to prevent unauthorized disclosure of data [35].

Besides what was already discussed above, other ethical considerations must also be taken into account [36]. The most important principles not yet mentioned are presented below:

• Research participants should not be subjected to harm in any way whatsoever.

• Respect for the dignity of research participants should be prioritized.

• Any deception or exaggeration about the aims and objectives of the research must be avoided.

• Affiliations in any form, sources of funding, as well as any possible conflicts of interests have to be declared.

• Any type of communication concerning the research should be done with honesty and transparency.

• Any type of misleading information, as well as the representation of primary data findings in a biased way, must be avoided.

2.4 Related work

In this section, some studies about indoor thermal comfort are going to be analyzed in detail.

Identifying a suitable method for studying thermal comfort in people’s homes


The less intrusive method consisted of installing a HOBO data logger in the living room to collect temperature and humidity measurements while the user was reading a book or watching TV for about 60 minutes, and of collecting the user's comfort perception through a questionnaire answered at the end of the experiment session.

The rigorous method was carried out by the experimenter, who would visit the living room at the end of each session and record environmental measurements according to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) standard, using laboratory-grade instruments, as well as notes about the participants' clothing and activity levels.

After comparing the analysis results of the two methods, the authors concluded that the proposed method and most of their assumptions, such as the temperature being homogeneous in the entire room and the airspeed being equal to or lower than 0.1 m/s, were accurate enough.

This conclusion can be helpful for future thermal comfort studies with minimal resources available, because it shows that results calculated based on such assumptions can be reliable.

However, this method is probably not the most reliable one, because the participants' context-awareness can interfere with their comfort perception, nor the most practical one, because participants were submitted to a detailed questionnaire.

The proposed method is also not the most adequate for long-term readings, nor for large-scale thermal comfort studies.

Open Source Building Science Sensors (OSBSS): A low-cost Arduino-based plat-form for long-term indoor environmental data collection

In [32], a low-cost data acquisition hardware prototype for long-term indoor environmental data collection was developed. This project focused on using open-source software and hardware in order to improve flexibility and integration.

The sensor prototypes included an Arduino board (Pro Mini), a lithium-ion polymer battery as a power source, a microSD card for data storage and one specific small electronic sensor (US Sensor ultra-precision NTC thermistor for temperature, Sensirion SHT15 digital humidity sensor for relative humidity and airstream temperature, Parallax PIR mini sensor for occupant proximity, SenseAir K-30 for CO2, TAOS TSL2561 luminosity sensor for light intensity, or Texas Instruments ADS1115 16-bit ADC for a generic analog voltage data logger).

The authors took into consideration a common mobile sensor problem, power draw, and implemented power-saving strategies, such as sleep mode, which disables board modules or functionalities when they are not being used, reducing energy consumption.

The prototypes were compared against the respective Onset HOBO sensors, and both output very similar results, which validated the solution. However, the loggers need to be manually assembled, soldered, and programmed. Besides being extremely time-consuming, which affects scalability, this process introduces potential sources of error, and debugging these circuits can be relatively difficult.
