Welcome to the NETDEVOPS Made Easy blog. It was created out of a personal need and with an educational purpose. There are lots and lots of websites out there with scattered information about networking capabilities, DevOps tools, Linux tools and Python. However, I have not come across a single repository of "working state" information for all of these topics.
This blog targets network engineers who want to start their journey into automation and Network DevOps and who want to learn as they build capabilities. Most posts will have a theory section and a hands-on section that will allow you to learn, understand why, and get a use case up and running.
Network administrators have been challenged over the years with properly monitoring their networks. Too many tools, lack of visibility and complex integrations are some of the challenges of network monitoring. The TIG Stack is an open source group of tools that can help with your infrastructure monitoring; some of its benefits are flexibility and cost. TIG stands for Telegraf, InfluxDB and Grafana.
Telegraf – An agent for collecting, processing, aggregating, and producing metrics.
InfluxDB – An open source database, optimized for large amounts of time-stamped data.
Grafana – An open source data visualization tool.
When combined, these services make for an incredibly powerful toolset that gives administrators real-time information about their infrastructure.
Data Retention Module Overview
The Data Retention module is one of the 4 modules of the Office Dashboard app. It is written in Python and is responsible for taking the data generated by the other 3 modules and saving it either to log files or to InfluxDB. This module can NOT run as a standalone.
This module has 2 functions; let's understand what each one does:
write_data(): Responsible for writing data to InfluxDB or to a log file, depending on the configuration.
write_list(): Responsible for recursing through a list of entries and saving each one with the write_data() function.
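To make the flow concrete, here is a minimal sketch of how these two functions could look. It is not the actual module code: the variable names (save_file, INFLUX_HOST, the measurement layout) are assumptions for illustration, and it uses the write_points() call from the influxdb Python client.

```python
# Minimal sketch of the Data Retention flow, NOT the actual module code.
# Variable names (save_file, INFLUX_HOST, ...) are assumptions for illustration.
import json
import logging

from influxdb import InfluxDBClient  # pip install influxdb

save_file = False             # hypothetical toggle from credentials.py
INFLUX_HOST = "localhost"     # hypothetical InfluxDB settings
INFLUX_DB = "office_dashboard"

logging.basicConfig(filename="office_dashboard.log", level=logging.INFO)


def write_data(entry, measurement):
    """Save one entry to InfluxDB or to a log file, depending on save_file."""
    if save_file:
        logging.info("%s %s", measurement, json.dumps(entry))
    else:
        client = InfluxDBClient(host=INFLUX_HOST, database=INFLUX_DB)
        point = {"measurement": measurement, "fields": entry}
        client.write_points([point])


def write_list(entries, measurement):
    """Walk a list of entries and hand each one to write_data()."""
    for entry in entries:
        write_data(entry, measurement)
```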
Configuring Data Retention Module
The only configuration needed for this module is in the credentials.py file, described in the "Using Data Retention Module" section.
For data visualization with the TIG Stack, the information must be saved to InfluxDB. Next you will need to log into Grafana to start building the graphs required to visualize all the data. I have saved my Grafana dashboard as a JSON file; it can be found in this link.
Next I will describe 3 different graphs to give an idea of how you can build your own customized dashboard.
Line Graph
Bar Graph
Pie Chart Graph
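The queries behind these panels are plain InfluxQL. Before building a panel in Grafana, I find it useful to test the same query with the influxdb Python client; the measurement and field names below (meraki_clients, mac, ssid) are assumptions, so replace them with whatever your modules actually write.

```python
# Sketch: test an InfluxQL query from Python before pasting it into a Grafana panel.
# Measurement/field names below are assumptions, not the app's real schema.
from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", database="office_dashboard")

# Example line-graph style query: client count per SSID in 15-minute buckets
query = (
    'SELECT count("mac") FROM "meraki_clients" '
    'WHERE time > now() - 24h GROUP BY time(15m), "ssid"'
)
for point in client.query(query).get_points():
    print(point)
```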
Using Data Retention Module
As mentioned before, unlike the other modules (Meraki, Webex and DNA Spaces), this module does not run as a standalone. It is only invoked by the OFFICE-DASHBOARD app.
Once the app is installed, the next step (step 5 in the link above) is to configure/update the credentials.py file. You can choose between 2 options (see the sketch after the list):
Write to InfluxDB: If you choose this option, set the "save_file" variable to "False".
Write to log files: If you choose this option, set the "save_file" variable to "True".
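As a reference, the relevant part of credentials.py could look like the sketch below. Only the save_file toggle comes from the steps above; the InfluxDB connection variable names are my own assumptions, so match them to whatever the file in the repository actually uses.

```python
# credentials.py (excerpt, sketch only)
# save_file controls where the Data Retention module writes data:
#   True  -> append to log files
#   False -> write to InfluxDB
save_file = False

# Hypothetical InfluxDB connection details (names are assumptions,
# check the real credentials.py in the repo for the exact variables)
influx_host = "localhost"
influx_port = 8086
influx_db = "office_dashboard"
```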
(venv) Linux:OFFICE-DASHBOARD$ python3 new-app.py
* Serving Flask app "new-app" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: on
* Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
Webex Devices Info Collected and Saved! - 0:00:00.447341
DNA Spaces Info Collected and Saved! - 0:00:00.823150
Meraki Clients Info Collected and Saved! - 0:00:02.052085
Meraki Cameras Info Collected and Saved! - 0:00:02.583565
Webex Devices Info Collected and Saved! - 0:00:00.441590
DNA Spaces Info Collected and Saved! - 0:00:00.741098
Meraki Clients Info Collected and Saved! - 0:00:01.527496
Meraki Cameras Info Collected and Saved! - 0:00:02.651590
Webex Devices Info Collected and Saved! - 0:00:00.567153
DNA Spaces Info Collected and Saved! - 0:00:00.827679
Meraki Clients Info Collected and Saved! - 0:00:01.549558
Meraki Dashboard Info Collected and Saved! - 0:00:23.853949
Meraki Cameras Info Collected and Saved! - 0:00:02.505838
Cisco DNA Spaces is a cloud-based location services platform. Through Cisco wireless infrastructure, organizations can gain insights into how people and things move throughout their physical spaces. With these insights, they can deliver contextual engagements that are valuable and relevant. Besides looking at where people go, organizations can also drive operational efficiencies by monitoring the location, movement, and utilization of assets. For example, retail and hospitality organizations can use the engagement toolkits to increase customer visits, revenues, and satisfaction. Through effective asset management, healthcare or workspace customers can save time and costs, and improve their productivity.
Cisco DNA Spaces comes from the July Systems acquisition made in 2018. Over the last couple of years Cisco has invested in bringing CMX capabilities into DNA Spaces.
(UPDATE) I wrote this post in the early days of COVID-19, and since then DNA Spaces has added several new apps to better help companies with their back-to-work strategy. You can find more information at this link.
(UPDATE) A new addition to DNA Spaces is the possibility of integrating Meraki MV cameras with DNA Spaces apps for workspace analytics.
DNA Spaces Module Overview
DNA Spaces analytics capabilities go well beyond the scope of OFFICE-DASHBOARD; through several different apps, DNA Spaces can add valuable insights into your workspace environment. This module uses only the Detect and Locate app to retrieve location information on clients connected to or probing your wireless infrastructure.
The DNA Spaces module is one of the 4 modules of the Office Dashboard app. It is written in Python and is responsible for collecting and receiving data from the DNA Spaces cloud service. Like some of the other modules, you can run this module standalone; I will detail how in a second. There is no difference in capabilities between running it standalone or via the dashboard app.
When running as a standalone module it will only collect information from the cloud; it will NOT be able to receive data from it. To receive notifications, you need to run the Office Dashboard app, because of the Flask endpoint it creates to listen for REST POST requests with notifications coming from the cloud.
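To picture what that Flask piece does, here is a minimal sketch of a POST endpoint that accepts notifications pushed from the cloud. The route path and payload handling are illustrative assumptions, not the app's actual implementation.

```python
# Sketch of the kind of Flask endpoint the dashboard app exposes for
# cloud notifications. Route name and payload handling are assumptions.
from flask import Flask, request

app = Flask(__name__)


@app.route("/dnaspaces/notifications", methods=["POST"])
def dnaspaces_notifications():
    payload = request.get_json(force=True, silent=True) or {}
    # In the real app the payload would be handed to the translation layer
    # and then to the Data Retention module; here we just acknowledge it.
    print("Received notification with", len(payload), "top-level keys")
    return "", 200


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```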
Above you can see the detailed diagram of all functions inside the module. This module has 2 major layers (Collection and Translation); let's understand what each layer/function does:
Data Collection layer: Responsible for handling communication with the cloud and processing the data collected. Important functions within this layer are:
get_clients(): This function is responsible for collecting a list of all clients connected to or probing your wireless network.
get_elements(): This function is responsible for collecting a list of all map elements, like buildings, floors and zones.
get_floor_images(): This function is responsible for collecting a list of all floor images.
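A stripped-down get_clients() could look like the sketch below. The endpoint URL and the X-API-KEY header reflect how I used the Detect and Locate API, but treat them as assumptions and verify them against the API documentation for your DNA Spaces account.

```python
# Sketch of a get_clients() style call against the Detect and Locate API.
# The URL and header name are assumptions based on my setup; verify them
# against the API documentation for your DNA Spaces account/region.
import requests

DNASPACES_API_KEY = "REPLACE_ME"  # key created in the Detect and Locate app
BASE_URL = "https://dnaspaces.io/api/location/v1"


def get_clients():
    """Return the list of clients currently seen by the wireless network."""
    resp = requests.get(
        f"{BASE_URL}/clients",
        headers={"X-API-KEY": DNASPACES_API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    clients = get_clients()
    print("Clients returned:", len(clients) if isinstance(clients, list) else clients)
```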
Data Translation layer: Responsible for manipulating and reformatting the JSON data so it can be sent to the Data Retention module. Important functions within this layer are:
prime_client(): This function is responsible for priming the JSON response by transforming all location coordinates to float.
prime_influx(): This function is responsible for priming the JSON response from the API calls.
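The translation step is essentially defensive type conversion so InfluxDB receives consistent field types. Here is a sketch of the coordinate-to-float idea behind prime_client(); the key names are assumptions, not the real payload schema.

```python
# Sketch of the coordinate clean-up idea behind prime_client().
# Key names ("xPos", "yPos", ...) are assumptions, not the real payload schema.
def prime_client(client):
    """Return a copy of the client dict with location coordinates cast to float."""
    primed = dict(client)
    for key in ("xPos", "yPos", "latitude", "longitude", "confidenceFactor"):
        if key in primed and primed[key] is not None:
            try:
                primed[key] = float(primed[key])
            except (TypeError, ValueError):
                primed.pop(key)  # drop values InfluxDB could not store as floats
    return primed
```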
Configuring DNA Spaces Module
Create and configure a DNA Spaces Connector VM (link)
Connect your Wireless Lan Controller to your newly deployed Connector (link)
Import Controllers into your Location Hierarchy (link)
Let's verify that you are getting location data:
Go to the top left corner and click on the 3 bars next to DNA Spaces to open the menu. Select "Setup", then "Wireless Networks".
On the new page, go to the "Connect via Spaces Connector" section and click "View Connectors" under "Configure Spaces Connector".
You will see a list of connectors. Select the one you just created and click the "i" icon on the far right; you should see the detailed statistics of the selected connector.
You want "Data Channel Connection Status:" to show Active, and the message rate graph at the bottom should display incoming and outgoing messages being exchanged.
Now that your controller is sending location data to DNA Spaces, you are ready to configure DNA Spaces to share this information via API.
Create an API key on Detect and Locate APP. (link)
One really important note: to get a non-empty response from the DNA Spaces APIs, your access points must be associated with a map. I did NOT find any public documentation on this, but found out by opening a TAC case with Cisco. See this link on how to set up map services.
Using DNA Spaces Module
Using this module is very straightforward. As you probably understand by now, there are 2 ways of running the module: one is by running the OFFICE-DASHBOARD app and letting the Flask scheduler do the job, and the second is running it standalone, which we will use for now (we need all modules configured for the app to run). So make sure you have the app installed on your Linux machine; for that, follow these instructions (https://github.com/diegogsoares/OFFICE-DASHBOARD#how-to-install-office-dashboard).
Once the app is installed the next step (step 5 in the link above) is to configure/update the credentials.py file with the information you got while configuring the module. You will need:
Cisco WEBEX is a cloud service that provides best-in-class collaboration capabilities like calling, meetings and messaging. Webex was founded in 1995 and acquired by Cisco in 2007. It has been a leader in the online meetings market for the last few years now; since the acquisition, Cisco has spent development efforts on integrating calling and messaging capabilities into the platform, and recently added voice assistant capabilities. Some key differentiators are its cloud scale and presence, and the integration of cloud calling services with meetings and messaging all on the same platform. For more, visit http://www.webex.com
WEBEX Module Overview
The WEBEX cloud has several capabilities; however, this module only uses its ability to cloud-manage video conference units. These intelligent video collaboration systems bring your meeting rooms to life, and WEBEX devices are equipped with integrated analytics that enables them to become sensors inside every meeting room. This module uses the xAPI interface of WEBEX devices to retrieve information like the number of people inside the room and the light and sound conditions.
The Webex module is one of the 4 modules of the Office Dashboard app. It is written in Python and is responsible for collecting data from the WEBEX cloud service. We decided to connect to the WEBEX cloud xAPI broker service instead of connecting straight to the devices, to simplify inventory and authentication. Like some of the other modules, you can run this module standalone; I will detail how in a second. There is no difference in capabilities between running it standalone or via the dashboard app.
Above you can see the detailed diagram of all functions inside the module. This module has 2 major layers (Collection and Translation) within the same function; let's understand what each layer/function does:
Data Collection layer: Responsible for handling communication with the cloud and processing the data collected. The important function within this layer is:
get_webex_device_details(): This function is responsible for collecting a list of all registered devices and then getting statistics and configuration from each device.
Data Translation layer: Responsible for manipulating and reformatting the JSON data so it can be sent to the Data Retention module. The important function within this layer is:
get_webex_device_details(): This function is responsible for building a flat JSON with all the collected information for each device.
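The collection side boils down to two Webex REST calls: list the registered devices, then query xAPI status for each one. The sketch below uses the /v1/devices and /v1/xapi/status endpoints with the bot token; the specific status key queried (RoomAnalytics.PeopleCount.Current) is just one example of the data the module gathers, and the whole thing is a simplified illustration rather than the module's real code.

```python
# Sketch of the get_webex_device_details() flow via the WEBEX cloud xAPI broker.
# The bot token placeholder and the status name queried are illustrative choices.
import requests

WEBEX_BOT_TOKEN = "REPLACE_ME"  # access token of the bot with workspace API access
HEADERS = {"Authorization": f"Bearer {WEBEX_BOT_TOKEN}"}
BASE_URL = "https://webexapis.com/v1"


def get_webex_device_details():
    """List registered devices and query a people-count status for each one."""
    devices = requests.get(f"{BASE_URL}/devices", headers=HEADERS, timeout=30)
    devices.raise_for_status()
    details = []
    for device in devices.json().get("items", []):
        status = requests.get(
            f"{BASE_URL}/xapi/status",
            headers=HEADERS,
            params={
                "deviceId": device["id"],
                "name": "RoomAnalytics.PeopleCount.Current",
            },
            timeout=30,
        )
        status.raise_for_status()
        details.append({"device": device.get("displayName"), "status": status.json()})
    return details


if __name__ == "__main__":
    for entry in get_webex_device_details():
        print(entry)
```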
Configuring WEBEX Module
Now that we understand the code and the steps it performs, let's look at what needs to be configured. Here are the steps:
WEBEX Hub account: You will need a valid account on the WEBEX Hub admin portal. If you don't have one, please talk to your WEBEX Contract Administrator.
Registered Webex Room Device: Under the Devices section, you will need a registered online device. This device will need to be associated with a workspace; refer to this link for steps on how to set up a workspace and associate a device.
Webex Bot Access Token: You will need to create a WEBEX bot and save its access token; it will be needed for the credentials file. See the instructions in this link to create a bot. (Make sure you log in with your Webex Hub account.)
Now that you have a registered device associated with a workspace and a bot, you just need to add permissions for that bot to be able to collect the information needed via the WEBEX cloud xAPI broker.
Workspace with API access: You will need to log in to your WEBEX Hub account and select Workspaces in the left menu. Select the workspace that your device is associated with, then click "Edit API Access". In the new menu, add the bot created in a previous step. Use Full Access as the access level.
Using Webex Module
Using this module is very straightforward. As you probably understand by now, there are 2 ways of running the module: one is by running the OFFICE-DASHBOARD app and letting the Flask scheduler do the job, and the second is running it standalone, which we will use for now (we need all modules configured for the app to run). So make sure you have the app installed on your Linux machine; for that, follow these instructions (https://github.com/diegogsoares/OFFICE-DASHBOARD#how-to-install-office-dashboard).
Once the app is installed the next step (step 5 in the link above) is to configure/update the credentials.py file with the information you got while configuring the APP. You will need:
Cisco Meraki is a cloud-managed IT infrastructure service, bought by Cisco Systems in 2012. It provides a best-in-class cloud web dashboard with management capabilities for wireless, switching, security, enterprise mobility management (EMM) and security cameras. Some key differentiators of the Meraki dashboard are its ease of use and the ability to integrate the entire portfolio under the same dashboard. Its cloud nature helps accelerate innovation and bring new capabilities to market, making Meraki customers some of the happiest customers you can find out there.
Meraki Module Overview
The Meraki module is one of the 4 modules of the Office Dashboard app. It is written in Python and is responsible for collecting and receiving data from the Meraki cloud service. Like some of the other modules, you can run this module standalone; I will detail how in a second.
When running as a standalone module it will only collect information from the dashboard; it will NOT be able to receive data from it. To receive location data and camera alerts, you need to run the Office Dashboard app, because of the Flask endpoint it creates to listen for REST POST requests with notifications coming from the cloud.
Above you can see the detailed diagram of all functions inside the module. This module has 2 major layers (Collection and Translation) and each layer has several different functions; let's understand what each layer/function does:
Data Collection layer: Responsible for handling communication with the cloud and processing the data collected/received. Important functions within this layer are:
analyze_camera_alert(): This is triggered by the MV webhook alert; it retrieves the image and sends it to the AWS Rekognition service to identify facial expressions and objects within the image. Once AWS replies with the image analysis, the answer is sent to the prime_aws_rekognition() function on the data translation layer to be flattened and later saved to InfluxDB.
save_meraki_location(): This is triggered by location POST data sent from the cloud. It validates the post based on the configuration and then sends the validated data to the prime_meraki_data() function on the data translation layer to be flattened and later saved to InfluxDB.
find_meraki_dashboard(): This function is triggered by the scheduling service when the dashboard app is running, or it runs once when in standalone mode. Using the Meraki SDK, it reaches out to the dashboard and collects information about networks and devices. Once collected, the data is sent to the prime_meraki_data() function on the data translation layer to be flattened and later saved to InfluxDB.
find_meraki_clients(): This function is triggered by the scheduling service when the dashboard app is running, or it runs once when in standalone mode. Using the Meraki SDK, it reaches out to the dashboard and collects information about clients and SSIDs. Once collected, the data is sent to the prime_meraki_data() function on the data translation layer to be flattened and later saved to InfluxDB.
find_meraki_camera(): This function is triggered by the scheduling service when the dashboard app is running, or it runs once when in standalone mode. Using the Meraki SDK, it reaches out to the dashboard and collects information about cameras, and it takes a snapshot to be saved and used by the dashboard app. Once collected, the data is sent to the prime_meraki_data() function on the data translation layer to be flattened and later saved to InfluxDB.
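As a reference for the find_meraki_* functions, here is a minimal sketch of the collection pattern using the v1 style of the meraki Python SDK. It is a simplified illustration, not the module's actual code.

```python
# Simplified sketch of the collection pattern behind the find_meraki_* functions,
# written against the v1 style of the meraki Python SDK (pip install meraki).
import meraki

MERAKI_API_KEY = "REPLACE_ME"


def collect_meraki_inventory():
    """Walk organizations -> networks -> clients and return a flat list of clients."""
    dashboard = meraki.DashboardAPI(MERAKI_API_KEY, suppress_logging=True)
    clients = []
    for org in dashboard.organizations.getOrganizations():
        for network in dashboard.organizations.getOrganizationNetworks(org["id"]):
            # Clients seen on this network over the last hour
            clients.extend(
                dashboard.networks.getNetworkClients(network["id"], timespan=3600)
            )
    return clients


if __name__ == "__main__":
    print("Clients collected:", len(collect_meraki_inventory()))
```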
Data Translation layer: Responsible for manipulating and reformatting the JSON data so it can be sent to the Data Retention module. Important functions within this layer are:
prime_aws_rekognition(): This function is responsible for flattening the JSON response and extracting the relevant emotions and objects from the analytics result.
prime_meraki_data(): This function is responsible for flattening the JSON response and formatting the data so it can be properly saved.
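On the Rekognition side, the flattening is essentially pulling the dominant emotions and the detected object labels out of the boto3 responses. Below is a sketch of that idea; the 80% confidence cut-off and the output field names are my own choices for illustration.

```python
# Sketch of the analyze/flatten pattern around AWS Rekognition (boto3).
# The 80% confidence cut-off and the output field names are illustrative choices.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")


def prime_aws_rekognition(image_bytes):
    """Detect faces and labels in an image and return a flat summary dict."""
    faces = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    labels = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=80)

    summary = {"people": len(faces["FaceDetails"]), "emotions": [], "objects": []}
    for face in faces["FaceDetails"]:
        # Keep only the most confident emotion per detected face
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        summary["emotions"].append(top["Type"])
    summary["objects"] = [label["Name"] for label in labels["Labels"]]
    return summary
```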
Configuring Meraki Module
Now that we understand the code and the steps it performs, let's look at what needs to be configured. Here are the steps:
Meraki API Key: Find instructions here. Save it; you will need it to update the credentials file.
Org ID: If you don't know what your Org ID is, run the script as standalone and it will find it for you, or follow the instructions below (there is also an SDK sketch after this list).
On the right side, click on Configuration, paste the API key you got in the previous step and hit Save. (Make sure "use proxy" is selected.)
On the right side, click on Run.
The API response will appear below, and you will find the Org ID in the id field. Save it; you will need it to update the credentials file.
Location Posting destination: Back in the dashboard, follow the steps documented here. Save the validator, version and secret; you will need them to update the credentials file.
MV motion alerts: Follow the steps documented here. DO NOT configure e-mail recipients; use webhooks instead.
Webhook Alerts: Follow the steps documented here. Once the HTTP server is configured, scroll up on that same page to the camera section, enable "Custom recipients for motion alerts" and add the HTTP server just configured as the recipient.
AWS API: Follow the steps documented here. When you are done you should have an Access Key ID and a Secret Access Key. Save them; you will need them to update the credentials file.
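If you prefer not to use the API explorer for the Org ID step above, the same lookup can be done in a few lines with the Meraki SDK. This is just a convenience sketch.

```python
# Convenience sketch: list organization IDs with the Meraki SDK
# instead of the API explorer steps above.
import meraki

MERAKI_API_KEY = "REPLACE_ME"

dashboard = meraki.DashboardAPI(MERAKI_API_KEY, suppress_logging=True)
for org in dashboard.organizations.getOrganizations():
    print(f"Org ID: {org['id']}  Name: {org['name']}")
```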
Using Meraki Module
Using this module is very straightforward. As you probably understand by now, there are 2 ways of running the module: one is by running the OFFICE-DASHBOARD app and letting the Flask scheduler do the job, and the second is running it standalone, which we will use for now (we need all modules configured for the app to run). So make sure you have the app installed on your Linux machine; for that, follow these instructions (https://github.com/diegogsoares/OFFICE-DASHBOARD#how-to-install-office-dashboard).
Once the app is installed the next step (step 5 in the link above) is to configure/update the credentials.py file with the information you got while configuring the APP. You will need:
Meraki API Key and Org ID
Meraki Location information (Validator, Version and Secret)
AWS API Information (Access Key, Secret access key and region)
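Pulled together, the values from the list above end up in credentials.py roughly like the sketch below. The variable names here are placeholders I picked for illustration; use the names already defined in the repository's credentials file.

```python
# credentials.py (excerpt, sketch only) - variable names are placeholders,
# keep the names that the repository's credentials.py already defines.
meraki_api_key = "REPLACE_ME"
meraki_org_id = "123456"

# Meraki location posting settings from the dashboard
meraki_validator = "REPLACE_ME"
meraki_version = "2.0"
meraki_secret = "REPLACE_ME"

# AWS Rekognition credentials
aws_access_key_id = "REPLACE_ME"
aws_secret_access_key = "REPLACE_ME"
aws_region = "us-east-1"
```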
Over the last several years the economy has helped keep the commercial real estate market really hot. Companies were growing at a fast pace, and expanding and updating office space has been one of the ways to attract and retain talent. No one wants to work in an office that looks like the '70s, or in a big cubicle with high walls that isolates you and prevents you from interacting with colleagues. According to CBRE, projections for 2020 are that demand for office space will remain strong, and that flexible space inventory will continue to increase, but at a slower pace.
(UPDATE) I wrote this post in the early days of COVID-19, when it was unclear what the outcomes would be. By now remote working and collaboration have become the "NEW NORMAL" and companies are questioning how and when we will send people safely back to the office. At this point no one can really predict how the corporate real estate market will come out of it. The only certainty is that it will not look like it did in the past.
Look at the big tech companies like Google and Facebook that have cool offices with open spaces, designed to promote creativity and productivity among employees. They are investing in technology not only because it's what they do, but because decisions are made based on data, and data shows that a more modern approach to the work environment, with smaller, more collaborative meeting rooms and open space, pays off. Other big players in this market are the flexible space companies like WeWork and Regus, which are huge tenants in large city markets and promote an open, more collaborative workspace.
Productivity and better employee sentiment are not the only drivers for all that change; smart building technologies and new "green" regulations are also influencing how companies re-imagine their workspace. One example is the increased demand for PoE lighting to save costs and adhere to regulations on reducing carbon footprint.
What is the problem?
The challenge that most facility teams have is to understand how employees use their office space. What is working? What is not working? What is the utilization status of my different services like quiet rooms, video conference rooms, the cafeteria, etc.?
Today there is no real means of getting that information without deploying a bunch of different sensors throughout different spaces and hoping for the best (that the data is going to be meaningful). This task presents its own challenges, like integration of systems and analytics tools that can bring some meaning to the data; also, who is going to support all of that?
Teams within companies that are responsible for the workspace usually do not have an IT background, so tools that can provide all the data needed to answer some of the questions above and improve decision making are scary and sometimes ignored.
How can analytics change the game?
Data is changing everything; data is the new digital gold. So many industries are transforming with it, and digital transformation is here to stay. Look at companies like Google and Facebook that became giants in the ad industry by utilizing data and what they know about you to serve more effective ads. Or take the Uber example: being able to surcharge for rides because they know there is a momentary increase in demand. Don't forget about Netflix, Amazon and so many others that use data to make decisions and have transformed the way we do things.
So why are we still in the past when it comes to data and workspaces? Do you really need to add 2000 sq ft to your office for desk expansion? How many people are in and out of your office every day? Do you know what the meeting room utilization is over a day or a week? Where do employees congregate, and where are they most productive?
Those are some questions that a lot of companies have no answers for. That is the reason Office Dashboard was created: to help customers answer these questions.
(UPDATE) Imagine if you could tell how many employees are on the 3rd floor of your main office. How about understanding where you should intensify cleaning because there is a higher concentration of people? Or just knowing who was in close contact with John Doe last Friday while he was in the office? Those are probably difficult questions if you do not have sensors and an analytics tool monitoring your office environment. This is a current example of why having an analytics dashboard of your office space can help.
What is Office Dashboard?
Office Dashboard is a web tool for data visualization of Key Performance Indicators (KPIs) related to your workspace environment. It was built to be flexible enough to take in data from any sort of device/sensor, with pre-built integrations for your Cisco infrastructure, allowing you to leverage your network as a sensor.
Written in Python, Office Dashboard is modular, allowing connections to specific cloud services for data collection. It also uses open source tools like InfluxDB and Grafana for data retention and visualization. With this analytics dashboard, companies will be able to answer questions like:
How many people do I have in my workspace throughout the day?
What are the most used spaces within my office?
What is the meeting room utilization throughout the day?
How many people are inside a meeting room?
Office Dashboard modules
As mentioned before, Office Dashboard is a modular application written in Python. Below is a description of each module:
Meraki: this module is responsible for collecting and receiving data from the Meraki cloud (if you need to know more about what Meraki is, click here). Within the Meraki cloud you will configure it to share location data and camera alerts with OFFICE-DASHBOARD, and the module will also use the API key so client and network data can be collected. With the collected data you will be able to have statistics about:
Clients (Employee Devices)
Location of Clients
Image Alerts from MV cameras
Image analytics using AWS Rekognition service.
Webex: this module is responsible for collecting data from Room Devices registered to the WEBEX cloud (if you need to know more about what WEBEX is, click here). The WEBEX cloud works as a proxy to communicate with video conference devices (Room Devices) and retrieve data from them, like:
Is the video conference being used?
What are the room environmental conditions, like noise, temperature and light levels?
Is the room empty?
How many people are inside the room?
DNA Spaces: this module is responsible for collecting and receiving data from the DNA Spaces cloud (if you need to know more about what DNA Spaces is, click here). The DNA Spaces cloud receives location data from Cisco on-prem Wi-Fi and overlays analytics on top of that data to build dashboards about your Wi-Fi utilization and location analytics. Office Dashboard collects that location information to enable data analysis similar to what is possible with Meraki location information.
Data Retention: this module is responsible for saving the data to log files or to InfluxDB for later visualization.