#RC Lens flag
Explore tagged Tumblr posts
Download free 4K desktop wallpapers: RC Lens logo, red and yellow silk fabric, French football team, RC Lens emblem, Ligue 1, RC Lens, France, football, RC Lens flag, FC Lens
Looking at items that might be fun to make from the Zelda series:
Master Sword variations (Goddess Sword included)
Golden Feather (possibly as a hair pin?)
Gilded Sword/Razor Sword and the Great Fairy Sword
Water Dragon’s Scale
Goddess Harp/Sheik’s Harp
Shield variations
Masks from Majora’s Mask
Megaton Hammer (hard or soft)
Hover Boots (possibly as slippers?)
Deku Princess in a bottle
Goddess Plume
Joy Pendant
Amber and Dusk relics
Spiritual Stones
Lens of Truth
Moon's Tear
RC Bombchu
Skull Necklace
Sickle Moon Flag or Hero’s Flag from Wind Waker
Light-up Poe Lanterns, perhaps with Poe Souls inside
Light-up Poe/Blue Flame bottles
I dunno. These are just things I think are neat.
Maybe attempt to do Agitha’s Umbrella. Or some NPC items.
These are just the first things I’m throwing around as ideas.
Demand for Contact Lenses Grows Strong amid Economic Turmoil Spurred by COVID-19 Pandemic: Fact.MR Report
The global Contact Lens market study presents an end-to-end compilation of the historical, current, and future outlook of the market, as well as the factors responsible for its growth. Using SWOT analysis, the study highlights the strengths, weaknesses, opportunities, and threats of each Contact Lens market player in a comprehensive way. Further, the report emphasizes the adoption pattern of contact lenses across various industries. The global Contact Lens market is estimated to register a CAGR of nearly 7% over the period 2019-2029 and is projected to reach a valuation of about US$ XX Mn/Bn by 2029.
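As a quick, purely illustrative sketch (the starting valuation below is made up, since the report's own figures are withheld), this is how a CAGR of roughly 7% over 2019-2029 relates a starting and ending market valuation:

# Purely hypothetical figures -- the report's actual valuations are not disclosed.
start_value_bn = 10.0   # assumed 2019 market valuation in US$ Bn (made-up number)
years = 10              # 2019 -> 2029
cagr = 0.07             # ~7% compound annual growth rate

end_value_bn = start_value_bn * (1 + cagr) ** years
print(f"Implied 2029 valuation: US$ {end_value_bn:.1f} Bn")        # ~19.7 Bn

# Inverting the relationship recovers the CAGR from any two valuations:
implied_cagr = (end_value_bn / start_value_bn) ** (1 / years) - 1
print(f"Implied CAGR: {implied_cagr:.1%}")                         # ~7.0%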
The Contact Lens market report highlights the following players:
· Johnson & Johnson
· Alcon Vision LLC
· Bausch & Lomb Incorporated
· ZEISS International
· Others
The Contact Lens market report examines the operating pattern of each player – new product launches, partnerships, and acquisitions – in detail.
Request to Download a Sample of Research Report @
https://www.factmr.com/connectus/sample?flag=S&rep_id=4536
The COVID-19 (coronavirus) pandemic is impacting society and the overall economy across the world, and its impact is growing day by day while also affecting supply chains. The crisis is creating uncertainty in the stock market, a massive slowdown of supply chains, falling business confidence, and increasing panic among customer segments. The overall effect of the pandemic is disrupting the production processes of several industries, and trade barriers are further restraining the demand-supply outlook. As governments of different regions have announced total lockdowns and the temporary shutdown of industries, overall production has been adversely affected, hindering the global Contact Lens market. This report on the Contact Lens market provides an analysis of the impact of COVID-19 on various business segments and country markets. It also showcases market trends and forecasts, factoring in the impact of the COVID-19 situation.
Important regions covered in the Contact Lens market report include:
· North America (U.S., Canada)
· Latin America (Brazil, Mexico, Argentina, Rest of Latin America)
· Europe (Germany, Italy, France, U.K., Spain, Benelux, Russia, Rest of Europe)
· East Asia (China, Japan, South Korea)
· South Asia & Oceania (India, Thailand, Indonesia, Malaysia, Australia & New Zealand, Rest of South Asia & Oceania)
· Middle East & Africa (GCC Countries, Turkey, Northern Africa, South Africa, Rest of Middle East & Africa)
The Contact Lens market report takes into consideration the following segments by material type:
· Gas Permeable
· Silicone Hydrogel
· Hybrid
The Contact Lens market report covers the following end uses:
· Corrective
· Therapeutic
· Cosmetic
· Prosthetic
· Lifestyle-oriented
A Customization of this Report is Available upon Request @
https://www.factmr.com/connectus/sample?flag=RC&rep_id=4536
The Contact Lens market report offers a plethora of insights which include:
· Changing consumption patterns among individuals globally.
· Historical and future progress of the global Contact Lens market.
· Region-wise and country-wise segmentation of the Contact Lens market to understand the revenue and growth outlook in these areas.
· Accurate Year-on-Year growth of the global Contact Lens market.
· Important trends, including proprietary technologies, ecological conservation, and globalization affecting the global Contact Lens market.
The Contact Lens market report answers important questions which include:
· Which regulatory authorities have granted approval to the application of Contact Lens in xx industry?
· How will the global Contact Lens market grow over the forecast period?
· Which end use industry is set to become the leading consumer of Contact Lens by 2029?
· What manufacturing techniques are involved in the production of the Contact Lens?
· Which regions are Contact Lens market players targeting to channel their production portfolios?
The Contact Lens market report considers the following years to predict the market growth:
· Historic Year: 2014 - 2018
· Base Year: 2014
· Estimated Year: 2019
· Forecast Year: 2019 – 2029
Why Choose Fact.MR?
Fact.MR follows a multi-disciplinary approach to extract information about various industries. Our analysts perform thorough primary and secondary research to gather data associated with the market. With modern industrial and digitalization tools, we provide avant-garde business ideas to our clients. We serve clients across all parts of the world with 24/7 service availability.
Developing a Disaster Monitoring Service with Solace PubSub+
MakeUofT is Canada’s largest makeathon. Similar to a hackathon, it’s a place where projects come to life; a makeathon focuses on hardware-based projects where students build something from scratch through hardware and software integration.
The theme for MakeUofT 2020 was connectivity and machine learning. As a prime sponsor of the event, Solace challenged participants to make the best use of Solace PubSub+ Event Broker to stream events and information across cloud, on-premises, and IoT environments.
Alok Deshpande, a graduate of the University of Toronto in Electrical Engineering, participated in MakeUofT 2020 with his group members and chose Solace's technology to build their project. Below, he shares how he and his group designed and developed their project.
Inspiration for the Project
Coronavirus is currently all you hear about in the news, and given that it’s a worldwide pandemic, this is quite reasonable. However, emergency events like disasters are still going on, even if you don’t hear about them as much. They can lead to the loss of lives and resources because of delays in notifying authorities, especially during times like these.
Natural disasters in remote areas, in particular, can go unnoticed because of personal and political reasons. For example, last June there was an issue here in Canada where wildfires were allowed to burn for extended periods of time because there weren’t enough members of staff available to monitor them. While detection by satellite imaging helps, clearly automatic real-time notifications from disposable sensors in the field would have provided supplementary support.
Sounds like a job for… Solace PubSub+ Event Broker!
The Telus-Telyou Project
Back in February, I worked on a team makeathon project at MakeUofT 2020 to develop the proof-of-concept of a monitoring service for this purpose, which we called Telus-Telyou. The goal was to develop connected sensor nodes that would automatically trigger warnings on connected endpoints based on their specific use. For example, in the case of fire detection, the endpoint might belong to a fire department.
There were three main requirements in this project:
The nodes should transmit over a cellular connection to enable both long-distance and reliable monitoring.
Many nodes should be able to provide a stream of notifications to many endpoints.
To keep the nodes low-power, events would be recorded in low resolution until an end-user requested more detail.
Fig. 1: General concept diagram
As shown in the diagram above, the connected nodes would transmit data to the cloud, to which multiple users could subscribe to receive notifications. Users could also publish messages to configure the nodes.
Our Design
Based on the available resources at the makeathon, my team and I decided that the sensory data could simply be temperature, humidity, GPS readings, and photos from a camera. These would be collected by a gateway computer and transmitted over LTE.
On the other end, the three endpoints were chosen to be social media (Twitter), mapping (Google Maps), and cloud computing services (Microsoft Azure) since these would simulate real-world clients in this proof-of-concept. The cloud service was chosen in particular because it would allow for integrating AI models to make predictions based on the raw sensor feed, as described in the section below on the extension to this project. Mapping was of course incorporated to ensure the end-user would be able to identify the location of the hazard.
Fig. 2: Diagram of the system architecture
Between the gateway and the endpoints, we chose to incorporate an event broker to handle the messaging, based on the second and third requirements. After looking into some options (including Azure’s own broker), we decided that Solace PubSub+ Event Broker was the way to go because it wouldn’t tie the project to any given service.
Building Telus-Telyou
To physically build the project, we used a Raspberry Pi computer as the gateway, connected to a Telus LTE modem shield and a RaspiCam camera. My teammates interfaced the sensors on the HAT as well as the camera to the Pi using provided APIs. We then configured the modem to transmit point-to-point so that we’d be able to communicate over TCP to PubSub+ Event Broker.
Fig. 3: Hardware – Raspberry Pi and LTE shield
After this came the fun part – setting up the event broker!
I used the Paho MQTT client to set up the Raspberry Pi to continuously publish the temperature, humidity, and GPS data, since these are tiny payloads.
import json
import numpy as np
import paho.mqtt.client

class IoT_Client(paho.mqtt.client.Client):
    def __init__(self, node_ID):
        # A non-empty client_id is required by Paho when clean_session is False
        super().__init__(client_id=f"node-{node_ID}", clean_session=False)
        self.publish_detailed = False
        self.node_ID = node_ID  # User-defined
        # A dictionary of requested measurements, with numeric keys and string identifiers as values
        self.sensor_data_enum = {idx: type_str for idx, type_str in
                                 enumerate(['temp', 'humidity', 'lat', 'longitude'])}

    def on_connect(self, client, userdata, flags, rc):
        self.subscribe(f"nodes/{self.node_ID}/detailed_request", 0)

    def add_measurement(self, meas_name: str):
        # Register a new measurement type under the next free numeric key
        self.sensor_data_enum[len(self.sensor_data_enum)] = meas_name

    def getPubTopic(self, datatype_idx, event_stream):
        return f"nodes/{self.node_ID}/{self.sensor_data_enum[datatype_idx]}/{event_stream}"

    def getPhotoPubTopic(self, event_stream):
        return f"nodes/{self.node_ID}/camera/{event_stream}"

# ...... (client instantiation, connection, and network-loop start elided)

try:
    # Get sensor feed continuously. Could be modified to pause conditionally
    while True:
        # Get sensor data from the wrapper. Input is the dictionary of requested measurements.
        # Returns sentinel NaN if a measurement type is absent or the measurements are faulty
        sensor_data, timestamp = getSensorData(iot_client.sensor_data_enum)
        for idx, val in enumerate(sensor_data):
            # For each requested measurement type: if faulty, publish diagnostics...
            if np.isnan(val):
                iot_client.publish(iot_client.getPubTopic(idx, 'diagnostics'),
                                   json.dumps({"error": "Absent or faulty"}))
            else:
                # ...otherwise publish the measurement
                payload = {"node": iot_client.node_ID,
                           "timestamp": timestamp,
                           iot_client.sensor_data_enum[idx]: val}
                iot_client.publish(iot_client.getPubTopic(idx, 'measure'),
                                   json.dumps(payload, indent=4))
        # .......
except:
    iot_client.loop_stop()
    iot_client.disconnect()
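The snippet above elides the actual client setup. A minimal sketch of what the instantiation and connection with Paho might look like (the broker hostname, port, and credentials here are placeholders, not the values we used):

# Sketch only: broker host, port, and credentials are placeholders.
iot_client = IoT_Client(node_ID=1)
iot_client.username_pw_set("client-username", "client-password")
iot_client.connect("mr-XXXXXXXX.messaging.solace.cloud", 1883, keepalive=60)
iot_client.loop_start()  # run the network loop in a background thread so the publish loop above can run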
I also set it up to subscribe to requests to publish photos, since they would be relatively larger messages and should only be sent infrequently.
import base64  # needed to make the image bytes JSON-serializable

# .......

# Set a flag so that more detailed information is published only when a request callback arrives
def on_message(client, userdata, msg):
    client.publish_detailed = True

# ......
iot_client.on_message = on_message
# .....

# Get sensor feed continuously. Could be modified to pause conditionally
while True:
    # ......
    # Publish more detailed messages (just the photo stream, for now)
    if iot_client.publish_detailed:
        iot_client.publish_detailed = False  # Clear the flag once the request is acknowledged
        image_contents_arr = getImage()  # Wrapper to take an image with the RaspiCam and read it from file
        im_bytes = bytearray(image_contents_arr)  # Convert to a byte array for serialization
        # Base64-encode the bytes so the JSON payload is serializable
        im_payload = json.dumps({"node": iot_client.node_ID,
                                 "timestamp": timestamp,
                                 "camera": base64.b64encode(im_bytes).decode("ascii")},
                                indent=4)
        iot_client.publish(iot_client.getPhotoPubTopic("measure"), im_payload)
# .......
Specifically, these requests would come from the end client, which would publish based on the end user’s interaction (not shown here). For this use case, we assumed that the end-user would review the sensory data and judge, based on the likelihood of an emergency, whether to request a photo stream.
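For illustration, a minimal sketch of such an end-client request using Paho's helper module (the broker address is a placeholder and the payload string is arbitrary; the topic matches the node's 'detailed_request' subscription above):

import paho.mqtt.publish as publish

# Sketch of an end client asking node 1 for the photo stream.
# The broker address is a placeholder; the topic matches the node's 'detailed_request' subscription.
publish.single("nodes/1/detailed_request",
               payload="send_photos",  # contents are arbitrary; the node only reacts to receiving a message
               hostname="mr-XXXXXXXX.messaging.solace.cloud",  # placeholder
               port=1883)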
Finally, we used Node-RED to connect the Twitter, Azure, and Google Maps endpoints.
Fig. 4: Picture of the Node-RED flow
The following screenshot shows part of the system in action, as it publishes messages to Twitter when the temperature exceeds an arbitrary threshold:
Fig. 5: Image of Tweets
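The threshold check itself lived in the Node-RED flow, but the same logic can be sketched as a small Python subscriber for illustration (the threshold value, topic filter, and broker address below are assumptions, not the flow's actual configuration):

import json
import paho.mqtt.client as mqtt

TEMP_ALERT_THRESHOLD = 40.0  # degrees Celsius; arbitrary, like the demo's threshold

def on_message(client, userdata, msg):
    # Measurements arrive as JSON on nodes/<id>/temp/measure (see the node code above)
    reading = json.loads(msg.payload)
    temp = reading.get("temp")
    if temp is not None and temp > TEMP_ALERT_THRESHOLD:
        print(f"ALERT from node {reading.get('node')}: temperature {temp} exceeds threshold")
        # In our actual flow, Node-RED forwarded this kind of alert to Twitter instead of printing it

subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect("mr-XXXXXXXX.messaging.solace.cloud", 1883)  # placeholder broker address
subscriber.subscribe("nodes/+/temp/measure", qos=0)
subscriber.loop_forever()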
Extension: Smart Public Camera
After the makeathon, we decided that this system could be greatly improved by leveraging the AI capabilities on Azure. This would allow end-users to avoid having to constantly monitor it. It would also solve the problem of having to run inference on the edge device, since many (unlike the Raspberry Pi) have limited memory.
To achieve this, we set up a pipeline on Azure ML where a Stream Analytics job would consume the messages from IoT Hub, run them through a predictive model, and then feed the results back to IoT Hub. From there, the prediction based on the sensor data was published to a new topic called ‘/prediction’, as shown below. Other clients could then subscribe to this prediction.
Fig. 6: Picture of the modified Node-RED flow
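A client interested only in the model output could subscribe to that topic with a few lines of Paho code; in this sketch the full topic string, payload shape, and broker address are assumptions, since only the '/prediction' suffix is named above:

import json
import paho.mqtt.client as mqtt

def on_prediction(client, userdata, msg):
    # The payload shape is an assumption; we only know results come back on a '/prediction' topic
    print("Model prediction received:", json.loads(msg.payload))

listener = mqtt.Client()
listener.on_message = on_prediction
listener.connect("mr-XXXXXXXX.messaging.solace.cloud", 1883)  # placeholder broker address
listener.subscribe("nodes/1/prediction", qos=0)               # assumed full topic name
listener.loop_forever()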
Unfortunately, this use case didn’t apply to our original application, since we couldn’t find any disaster-prediction AI models that would make use of our limited sensor data.
The alternative application we considered was in civil emergencies, such as identifying the need for help from cameras installed in a city. Since we had a RaspiCam, we imagined it could be installed similarly and an SOS gesture detected. After finding a pre-built gesture detector on Azure, we fed it images of hand gestures taken with the RaspiCam and had the AI model predict whether a hand was raised.
The video below shows the prediction from the model – a bounding box when the gesture is detected – superimposed on the feed from the camera. Incidentally, the choppiness (low frame rate) is due to a combination of the slow inference rate and the rate of MQTT publications.
https://solace.com/wp-content/uploads/2020/06/Video.mp4
What We Learned
For many of us, this was our first time working on an IoT project, so it was fun to see how easily a physical device could be connected to the cloud and web services we are familiar with. My team and I discovered how easy to use and versatile Solace’s PubSub+ Platform is for this kind of application. The starter examples in Python and Node.js were helpful for getting up and running, and the Web Messaging demo (‘Try Me!’) was particularly helpful for just testing out the API. We also learned a lot about event-driven architecture from talking with Solace employees, who were very helpful. For our next project, we hope to try other Solace messaging platforms!
Check It Out Here
For more information, you can find our project on GitHub.
Alok Deshpande is a recent graduate from the University of Toronto in Electrical Engineering. His interests lie in embedded development and machine learning for IoT applications.