Nukiana
- this page gives an overview of importing Nuki Web + Bridge data into MariaDB by parsing JSON to CSV
- the Nuki Web data comes directly from web.nuki.io to the [parser node]
- the Nuki Bridge data comes from pulling .json data from the [collector node]
1. State of documentation
This project was born on 02.08.2018 and a first working prototype was published on 09.08.2018. It is still quite imperfect and not very performant, but it works! I hope you will enjoy the work and maybe have ideas on how to improve the project.
All things were tested with ...
- Ubuntu 20.04.1 LTS (Linux 5.4.0-51-generic) for x86_ (checked @ 29.10.2020)
- JQ 1.6 (jq-1.6-1) (checked @ 29.10.2020)
- Grafana 7.3.0 (checked @ 29.10.2020)
- MariaDB 10.1.44 (checked @ 16.07.2020)
- Nuki Bridge Firmware 1.17.1 (checked @ 29.10.2020)
- Nuki Bridge WiFi Firmware 1.2.0 (checked @ 29.10.2020)
- Nuki Smartlock Firmware 1.11. (checked @ 29.10.2020)
- Nuki Swagger API → https://api.nuki.io (here you can find documentation on data models)
- Nuki Web → https://web.nuki.io (here you can retrieve your personal API key)
Used Documentation:
- Nuki Bridge API 1.8 (15.03.2019)
- Nuki Web API 1.1 (08.05.2018)
2. Requirements to utilize this project for your own needs
There are a few requirements you will need to fulfill to make this thing work. You need …
- a running Grafana instance with access to it
- I am using Grafana v6
- a Linux machine (virtual/bare metal) without super cow powers - this application is not resource-intensive
- I am using Ubuntu 18.04 LTS on an Odroid X4
- some basic knowledge about …
- accessing servers via SSH / working with console
- bash scripts
- administering MariaDB server (installation, starting, stopping, creating tables and users)
- basic knowledge about JSON format
- basic knowledge on how to work with Grafana Dashboards and Datasources
- network configuration: Port Forwarding, DynDNS, NAT traversal, SSH tunnels, iptables
3. Limitations
The current setup of these scripts is made for one Nuki Web account, which may include one or more Smartlocks. If there are several Smartlocks under one Web account, the scripts should grab information from all locks automatically. If you want to monitor more than one Nuki Web account, you will need to set up the same script architecture more than once. The same applies to the Nuki Bridge: each bridge requires its own script set. BUT: you only need one database! You may use the same database for everything, but you can also add a datasource per Smartlock if you like.
4. Prerequisites
4.1 JQ JSON parser
JQ is a really nice tool for parsing JSON data. It also allows exporting/converting data to other file formats; in this case we use the @csv filter from JQ. By the way, you can use https://jqplay.org/ to test different filter syntax without needing an SSH shell.
#Install JQ to convert nuki-web.json to nuki-web.csv
apt-get install jq
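A small illustration of what the @csv filter produces (the sample JSON is made up; the filter is the same one the scripts in section 5 use):
echo '[{"name":"Front Door","state":1},{"name":"Back Door","state":3}]' | jq -r '(map(keys)|add|unique) as $cols|map(. as $row|$cols|map($row[.])) as $rows|$cols, $rows[]|@csv'
#prints:
#"name","state"
#"Front Door",1
#"Back Door",3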
4.2 Database nuki
#Install MariaDB + Configure it
apt-get install mariadb-server
#Create a new database
mysql -u root
CREATE DATABASE nuki;
CREATE USER 'nuki'@'localhost' IDENTIFIED BY 'YOURPASSWORD';
CREATE USER 'nuki'@'%' IDENTIFIED BY 'YOURPASSWORD';
SELECT * FROM mysql.user;
USE nuki;
GRANT ALL PRIVILEGES ON `nuki`.* TO `nuki`@`localhost`;
GRANT ALL PRIVILEGES ON `nuki`.* TO `nuki`@`%`;
FLUSH PRIVILEGES;
\q
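#the table and function definitions below are executed inside the nuki database - reconnect first (e.g. as root)
mysql -u root nuki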
/*this table is a merge of /smartlock/; /smartlock/config/ and /smartlock/advanced/config/*/
CREATE TABLE IF NOT EXISTS Smartlock
(
`accountId` VARCHAR(9) NOT NULL,
`adminPinState` SMALLINT NOT NULL,
`advertisingMode` SMALLINT NOT NULL,
`authId` VARCHAR(24),
`autoLockTimeout` SMALLINT NOT NULL,
`autoLock` BOOLEAN NOT NULL,
`autoUnlatch` BOOLEAN NOT NULL,
`autoUpdateEnabled` BOOLEAN NOT NULL,
`automaticBatteryTypeDetection` BOOLEAN NOT NULL,
`batteryCharge` SMALLINT NOT NULL,
`batteryCharging` BOOLEAN NOT NULL,
`batteryCritical` BOOLEAN NOT NULL,
`batteryType` SMALLINT NOT NULL,
`buttonEnabled` BOOLEAN NOT NULL,
`capabilities` INT NOT NULL,
`creationDate` DATETIME NOT NULL,
`daylightSavingMode` SMALLINT NOT NULL,
`detachedCylinder` BOOLEAN NOT NULL,
`doorState` SMALLINT NOT NULL,
`doubleButtonPressAction` SMALLINT NOT NULL,
`favorite` BOOLEAN NOT NULL,
`firmwareVersion` INT NOT NULL,
`fobAction1` SMALLINT NOT NULL,
`fobAction2` SMALLINT NOT NULL,
`fobAction3` SMALLINT NOT NULL,
`fobPaired` BOOLEAN NOT NULL,
`hardwareVersion` INT NOT NULL,
`homekitState` SMALLINT NOT NULL,
`keypadPaired` BOOLEAN NOT NULL,
`lastAction` SMALLINT NOT NULL,
`latitude` FLOAT NOT NULL,
`ledBrightness` SMALLINT NOT NULL,
`ledEnabled` BOOLEAN NOT NULL,
`lngTimeout` SMALLINT NOT NULL,
`lockedPositionOffsetDegrees` SMALLINT NOT NULL,
`longitude` FLOAT NOT NULL,
`mainName` VARCHAR(255),
`mode` SMALLINT NOT NULL,
`name` VARCHAR(255),
`nightMode` BOOLEAN NOT NULL,
`operatingMode` INT NOT NULL,
`pairingEnabled` BOOLEAN NOT NULL,
`ringToOpenTimer` INT NOT NULL,
`serverState` SMALLINT NOT NULL,
`singleButtonPressAction` SMALLINT NOT NULL,
`singleLock` BOOLEAN NOT NULL,
`singleLockedPositionOffsetDegrees` SMALLINT NOT NULL,
`smartlockId` VARCHAR(9) NOT NULL PRIMARY KEY, /*primary key = unique!*/
`state` SMALLINT NOT NULL,
`timezoneId` SMALLINT NOT NULL,
`timezoneOffset` SMALLINT NOT NULL,
`totalDegrees` SMALLINT NOT NULL,
`trigger` SMALLINT NOT NULL,
`type` SMALLINT NOT NULL,
`unlatchDuration` SMALLINT NOT NULL,
`unlockedPositionOffsetDegrees` SMALLINT NOT NULL,
`unlockedToLockedTransitionOffsetDegrees` SMALLINT NOT NULL,
`updateDate` DATETIME NOT NULL
);
CREATE TABLE IF NOT EXISTS SmartlockLog
(
`accountUserId` VARCHAR(24),
`action` SMALLINT NOT NULL,
`authId` VARCHAR(24),
`autoUnlock` BOOLEAN NOT NULL,
`date` DATETIME NOT NULL,
`deviceType` BOOLEAN NOT NULL,
`id` VARCHAR(24) NOT NULL PRIMARY KEY, /*primary key = unique!*/
`name` VARCHAR(255),
`smartlockId` VARCHAR(9) NOT NULL,
`state` SMALLINT NOT NULL,
`trigger` SMALLINT NOT NULL
);
CREATE TABLE IF NOT EXISTS SmartlockAuth
(
`accountUserId` VARCHAR(24),
`allowedFromDate` DATETIME NOT NULL,
`allowedFromTime` SMALLINT,
`allowedUntilDate` DATETIME NOT NULL,
`allowedUntilTime` SMALLINT,
`allowedWeekDays` SMALLINT NOT NULL,
`authId` VARCHAR(24), /*it may be shorter -> 3 digits*/
`creationDate` DATETIME NOT NULL,
`enabled` BOOLEAN NOT NULL,
`id` VARCHAR(24) NOT NULL PRIMARY KEY, /*primary key = unique!*/
`lastActiveDate` DATETIME NOT NULL,
`lockCount` SMALLINT NOT NULL,
`name` VARCHAR(255),
`remoteAllowed` BOOLEAN NOT NULL,
`smartlockId` VARCHAR(9) NOT NULL,
`type` SMALLINT NOT NULL,
`updateDate` DATETIME NOT NULL
);
DROP FUNCTION IF EXISTS DaysFromBitMask;
DELIMITER ++
CREATE FUNCTION DaysFromBitMask (val INT)
RETURNS VARCHAR(255) DETERMINISTIC
BEGIN
DECLARE DAY_RESULT VARCHAR(255);
SET DAY_RESULT="";
IF val = 0 THEN SET DAY_RESULT='Keine'; END IF;
IF val >= 64 THEN SET DAY_RESULT='Mo'; SET val=val-64; END IF;
IF val >= 32 THEN SET DAY_RESULT=CONCAT('Di',', ',DAY_RESULT); SET val=val-32; END IF;
IF val >= 16 THEN SET DAY_RESULT=CONCAT('Mi',', ',DAY_RESULT); SET val=val-16; END IF;
IF val >= 8 THEN SET DAY_RESULT=CONCAT('Do',', ',DAY_RESULT); SET val=val-8; END IF;
IF val >= 4 THEN SET DAY_RESULT=CONCAT('Fr',', ',DAY_RESULT); SET val=val-4; END IF;
IF val >= 2 THEN SET DAY_RESULT=CONCAT('Sa',', ',DAY_RESULT); SET val=val-2; END IF;
IF val >= 1 THEN SET DAY_RESULT=CONCAT('So',', ',DAY_RESULT); SET val=val-1; END IF;
/*If the result still ends with ", " (i.e. it does not end with "Keine" or "Mo"), trim the trailing comma and space*/
SET DAY_RESULT = CASE WHEN DAY_RESULT LIKE '%, ' THEN LEFT(DAY_RESULT,length(DAY_RESULT)-2) ELSE DAY_RESULT END;
RETURN DAY_RESULT;
END
++
DELIMITER ;
GRANT EXECUTE ON FUNCTION DaysFromBitMask TO nuki;
/*Test the function with some int values via SELECT DaysFromBitMask(yourValue);*/
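/*Illustrative checks, derived from the bit order above: SELECT DaysFromBitMask(96); returns 'Di, Mo' (64 = Mo, 32 = Di) and SELECT DaysFromBitMask(0); returns 'Keine'*/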
/*see the parser script: the nukiId has to be inserted via jq*/
CREATE TABLE IF NOT EXISTS BridgeCallbackList
(
`id` SMALLINT NOT NULL,
`nukiId` VARCHAR(9) NOT NULL,
`url` TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS BridgeInfo
(
`appVersion` VARCHAR(24),
`bridgeType` SMALLINT NOT NULL,
`currentTime` DATETIME NOT NULL,
`deviceType` INT NOT NULL,
`firmwareVersion` VARCHAR(24),
`hardwareId` VARCHAR(24),
`name` VARCHAR(255) NOT NULL,
`nukiId` VARCHAR(9) NOT NULL PRIMARY KEY, /*primary key = unique!*/
`paired` SMALLINT NOT NULL,
`rssi` SMALLINT NOT NULL,
`serverConnected` SMALLINT NOT NULL,
`serverId` VARCHAR(24) NOT NULL,
`uptime` BIGINT NOT NULL,
`wifiFirmwareVersion` VARCHAR(24)
);
CREATE TABLE IF NOT EXISTS BridgeList
(
`batteryCritical` BOOLEAN NOT NULL,
`mode` INT NOT NULL,
`name` SMALLINT NOT NULL,
`nukiId` VARCHAR(9) NOT NULL PRIMARY KEY, /*primary key = unique!*/
`state` SMALLINT NOT NULL,
`stateName` VARCHAR(24) NOT NULL,
`timestamp` DATETIME NOT NULL
);
/*see the parser script: the nukiId has to be inserted via jq*/
CREATE TABLE IF NOT EXISTS BridgeLockState
(
`batteryCritical` BOOLEAN NOT NULL,
`mode` INT NOT NULL,
`nukiId` VARCHAR(9) NOT NULL PRIMARY KEY, /*primary key = unique!*/
`state` SMALLINT NOT NULL,
`stateName` VARCHAR(24) NOT NULL,
`success` BOOLEAN NOT NULL
);
/*see the parser script: the bridgeId (= the bridge's nukiId) has to be inserted via jq*/
CREATE TABLE IF NOT EXISTS BridgeLog
(
`bridgeId` VARCHAR(9) NOT NULL,
`cmdId` VARCHAR(9),
`connection` SMALLINT,
`macAddr` VARCHAR(12),
`nukiId` VARCHAR(9),
`timestamp` DATETIME NOT NULL,
`type` VARCHAR(255) NOT NULL
);
\q
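Optionally verify that all tables were created before moving on:
mysql -unuki -pYOURPASSWORD nuki -e "SHOW TABLES;"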
5. Nuki parsing + conversion
5.1 Smartlock Data
mkdir -p /opt/nuki/web
chmod -R o-rwx /opt/nuki/ #disable access for other users except user/group
chmod -R ug+rwx /opt/nuki/
vim /opt/nuki/web/nuki_web_job.sh
/opt/nuki/web/nuki_web_job.sh
#!/bin/bash
NUKI_ROOT="/opt/nuki/web"
cd $NUKI_ROOT
API_TOKEN="YourPersonal80CharacterApiToken"
CURL_HEADER="--header 'Accept: application/json' --header 'Authorization: Bearer $API_TOKEN'"
BASE_URL="https://api.nuki.io"
JQ_CSV='(map(keys)|add|unique) as $cols|map(. as $row|$cols|map($row[.])) as $rows|$cols, $rows[]|@csv'
FILE_BASE="nuki_web"
DB_PASS="YourPassword"
DB_USER="nuki"
DB_NAME="nuki"
while [ true ]; do
eval "curl -X GET $CURL_HEADER $BASE_URL/smartlock/log?limit=0 -o "$FILE_BASE"_SmartlockLog.json"
jq -r "$JQ_CSV" "$FILE_BASE"_SmartlockLog.json > "$FILE_BASE"_SmartlockLog.csv
sed -i 's/true/1/g' "$FILE_BASE"_SmartlockLog.csv
sed -i 's/false/0/g' "$FILE_BASE"_SmartlockLog.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE SmartlockLog; LOAD DATA LOCAL INFILE '"$FILE_BASE"_SmartlockLog.csv' INTO TABLE SmartlockLog FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
eval "curl -X GET $CURL_HEADER $BASE_URL/smartlock/auth -o "$FILE_BASE"_SmartlockAuth.json"
jq -r ".[]|{authId}+{allowedFromDate}+{allowedUntilDate}+{allowedFromTime}+{allowedUntilTime}+{allowedWeekDays}+{lastActiveDate}+{creationDate}+{updateDate}+." "$FILE_BASE"_SmartlockAuth.json | jq -s . | jq -r "$JQ_CSV" > "$FILE_BASE"_SmartlockAuth.csv
sed -i 's/true/1/g' "$FILE_BASE"_SmartlockAuth.csv
sed -i 's/false/0/g' "$FILE_BASE"_SmartlockAuth.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE SmartlockAuth; LOAD DATA LOCAL INFILE '"$FILE_BASE"_SmartlockAuth.csv' INTO TABLE SmartlockAuth FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
eval "curl -X GET $CURL_HEADER $BASE_URL/smartlock -o "$FILE_BASE"_Smartlock.json"
jq -r ".[]|[{smartlockId}+{accountId}+{type}+{authId}+({\"mainName\":.name}|del(.name))+{favorite}+{firmwareVersion}+{hardwareVersion}+{serverState}+{adminPinState}+{creationDate}+{updateDate}+.config+.advancedConfig+.state]|$JQ_CSV" "$FILE_BASE"_Smartlock.json > "$FILE_BASE"_Smartlock.csv
sed -i 's/true/1/g' "$FILE_BASE"_Smartlock.csv
sed -i 's/false/0/g' "$FILE_BASE"_Smartlock.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE Smartlock; LOAD DATA LOCAL INFILE '"$FILE_BASE"_Smartlock.csv' INTO TABLE Smartlock FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
sleep 30
done
chmod 770 nuki_web_job.sh
#test the script
./nuki_web_job.sh
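If the import works, the tables should start filling up; a quick count (credentials as created in section 4.2) is enough to verify it:
mysql -unuki -pYourPassword nuki -e "SELECT COUNT(*) FROM Smartlock; SELECT COUNT(*) FROM SmartlockLog; SELECT COUNT(*) FROM SmartlockAuth;"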
5.2 Nuki Bridge Collector
Because my database (the Grafana datasource) is on a different network than the Nuki Bridge, I needed to split up the installation. That means I grab the JSON files from the Nuki Bridge with a Raspberry Pi which is on the same network. Using a dedicated SSH user with public key authentication, the Odroid then pulls the data from the Pi.
5.2.1 On the collector node side
The following scripts are placed on a server on the same network as the Nuki Bridge. You could also open port 8080 through the firewall, but I did not want to do this because the basic Nuki token is really short and unsafe (only 6 characters). The script collects data from the bridge and saves it to /opt/nuki/bridge. An rsync job later grabs this data.
mkdir -p /opt/nuki/bridge
chmod -R o-rwx /opt/nuki/ #disable access for other users except user/group
vim /opt/nuki/bridge/nuki_bridge_job-collector.sh
/opt/nuki/bridge/nuki_bridge_job-collector.sh
#!/bin/bash
NUKI_ROOT="/opt/nuki/bridge"
cd $NUKI_ROOT
CURL_HEADER="--header 'Accept: application/json'"
BASE_URL="http://<YourLocalIPAddress>:8080"
FILE_BASE="nuki_bridge"
TOKEN="Your6CharacterToken"
LOG_COUNT=100
LOG_OFFSET=0
NUKIID="Your8CharacterNukiID"
while [ true ]; do
eval "curl -X GET $CURL_HEADER '$BASE_URL/info?token=$TOKEN' -o "$FILE_BASE"_info.json"
eval "curl -X GET $CURL_HEADER '$BASE_URL/list?token=$TOKEN' -o "$FILE_BASE"_list.json"
eval "curl -X GET $CURL_HEADER '$BASE_URL/lockState?token=$TOKEN&nukIid=$NUKIID' -o "$FILE_BASE"_lockState.json"
eval "curl -X GET $CURL_HEADER '$BASE_URL/callback/list?token=$TOKEN' -o "$FILE_BASE"_callback-list.json"
eval "curl -X GET $CURL_HEADER '$BASE_URL/log?token=$TOKEN&count=$LOG_COUNT&offset=$LOG_OFFSET' -o "$FILE_BASE"_log.json"
sleep 30
done
chmod 770 nuki_bridge_job-collector.sh
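Before installing the loop as a service, a one-off request (using the same IP and token placeholders as in the script) confirms that the bridge answers:
curl 'http://<YourLocalIPAddress>:8080/info?token=Your6CharacterToken'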
vim /opt/nuki/bridge/nuki-bridge-collector.service
/opt/nuki/bridge/nuki-bridge-collector.service
[Unit]
After=network.target
Description=Nuki Bridge Collector Service
[Service]
Type=simple
ExecStart=/opt/nuki/bridge/nuki_bridge_job-collector.sh
KillMode=process
Restart=on-failure
RestartSec=10s
RemainAfterExit=yes
User=root
Group=root
[Install]
WantedBy=multi-user.target
ln -sf /opt/nuki/bridge/nuki-bridge-collector.service /etc/systemd/system/nuki-bridge-collector.service
systemctl daemon-reload
systemctl enable nuki-bridge-collector.service
service nuki-bridge-collector restart && service nuki-bridge-collector status
5.2.2 SSH sync user
#[collector node]
addgroup --gid 1111 sshbackup
adduser --gecos "" --uid 1111 --gid 1111 --disabled-password sshbackup
passwd sshbackup
#from any client - check if it works + copy pub key
ssh -o PubkeyAuthentication=no sshbackup@sub.doma.in -p 22 #with password
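#generate the key pair first if it does not exist yet (the key type is up to you; the file name "sshbackup" matches the commands below)
ssh-keygen -t ed25519 -f ~/.ssh/sshbackup -C "nuki bridge sync"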
ssh-copy-id -o PubkeyAuthentication=no -f -i ~/.ssh/sshbackup sshbackup@sub.doma.in -p 22
#if something fails with too many bad authentications, check fail2ban
fail2ban-client status sshd
sudo grep 'Ban ' /var/log/fail2ban.log*
fail2ban-client set sshd unbanip <TheIP>
#[collector node]
chsh -s /bin/bash sshbackup
passwd -d sshbackup
#[parser node]
#copy the sshbackup private key to /home/pi/.ssh/sshbackup
chmod 600 /home/pi/.ssh/sshbackup
#check if it works with public key
ssh -i ~/.ssh/sshbackup sshbackup@sub.doma.in -p 22
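With the key in place, the pull that the parser script performs later can also be tested manually from the [parser node] (this is the same rsync call used in section 5.2.3, with the placeholders expanded):
/usr/bin/rsync -avz -e "ssh -i ~/.ssh/sshbackup -p 22" sshbackup@sub.doma.in:/opt/nuki/bridge/*.json /opt/nuki/bridge/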
5.2.3 On the parsing node side
The following scripts live on the same server where MariaDB is installed.
mkdir -p /opt/nuki/bridge
chmod -R o-rwx /opt/nuki/ #disable access for other users except user/group
chmod -R ug+rwx /opt/nuki/
vim /opt/nuki/bridge/nuki_bridge_job-parser.sh
/opt/nuki/bridge/nuki_bridge_job-parser.sh
#!/bin/bash
NUKI_ROOT="/opt/nuki/bridge"
cd $NUKI_ROOT
JQ_CSV='(map(keys)|add|unique) as $cols|map(. as $row|$cols|map($row[.])) as $rows|$cols, $rows[]|@csv'
FILE_BASE="nuki_bridge"
NUKIID="Your8CharacterNukiID"
DB_PASS="YourPassword"
DB_USER="nuki"
DB_NAME="nuki"
REMOTE_HOST="sub.doma.in"
REMOTE_PORT="22"
REMOTE_USER="sshbackup"
REMOTE_PRIVKEY="~/.ssh/sshbackup"
while [ true ]; do
/usr/bin/rsync -avz -e "ssh -i $REMOTE_PRIVKEY -p $REMOTE_PORT" $REMOTE_USER@$REMOTE_HOST:$NUKI_ROOT/*.json $NUKI_ROOT/
jq -r "[{bridgeType}+(.|{ids}|{hardwareId}+.[])+(.|{versions}|{appVersion}+{firmwareVersion}+{wifiFirmwareVersion}+.[])+{uptime}+{currentTime}+{serverConnected}+(.|{scanResults}|.[][])]|$JQ_CSV" "$FILE_BASE"_info.json > "$FILE_BASE"_info.csv
sed -i 's/true/1/g' "$FILE_BASE"_info.csv
sed -i 's/false/0/g' "$FILE_BASE"_info.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE Table BridgeInfo; LOAD DATA LOCAL INFILE '"$FILE_BASE"_info.csv' INTO TABLE BridgeInfo FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
jq -r "[.[]|{nukiId}+{name}+({lastKnownState}|.[])]|$JQ_CSV" "$FILE_BASE"_list.json > "$FILE_BASE"_list.csv
sed -i 's/true/1/g' "$FILE_BASE"_list.csv
sed -i 's/false/0/g' "$FILE_BASE"_list.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeList; LOAD DATA LOCAL INFILE '"$FILE_BASE"_list.csv' INTO TABLE BridgeList FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
jq -r "[.+{nukiId:\"$NUKIID\"}]|$JQ_CSV" "$FILE_BASE"_lockState.json > "$FILE_BASE"_lockState.csv
sed -i 's/true/1/g' "$FILE_BASE"_lockState.csv
sed -i 's/false/0/g' "$FILE_BASE"_lockState.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeLockState; LOAD DATA LOCAL INFILE '"$FILE_BASE"_lockState.csv' INTO TABLE BridgeLockState FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
#will return "Cannot iterate over null (null)" if file is empty!
jq -r ".|{callbacks}|\"$NUKIID\" as \$a|.callbacks[].nukiId=\$a|.[]|$JQ_CSV" "$FILE_BASE"_callback-list.json > "$FILE_BASE"_callback-list.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeCallbackList; LOAD DATA LOCAL INFILE '"$FILE_BASE"_callback-list.csv' INTO TABLE BridgeCallbackList FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
jq -r "[.[]|{timestamp}+{type}+{connection}+{nukiId}+{cmdId}+{macAddr}+{bridgeId:\"$NUKIID\"}]|$JQ_CSV" "$FILE_BASE"_log.json > "$FILE_BASE"_log.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeLog; LOAD DATA LOCAL INFILE '"$FILE_BASE"_log.csv' INTO TABLE BridgeLog FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
sleep 30
done
chmod 770 nuki_bridge_job-parser.sh
#test the script
./nuki_bridge_job-parser.sh
5.3 Install scripts as service
vim /opt/nuki/web/nuki-web.service
/opt/nuki/web/nuki-web.service
[Unit]
After=network.target
Description=Nuki Web Service
[Service]
Type=simple
ExecStart=/opt/nuki/web/nuki_web_job.sh
KillMode=process
Restart=on-failure
RestartSec=10s
RemainAfterExit=yes
User=root
Group=root
[Install]
WantedBy=multi-user.target
ln -sf /opt/nuki/web/nuki-web.service /etc/systemd/system/nuki-web.service
systemctl daemon-reload
systemctl enable nuki-web.service
service nuki-web restart && service nuki-web status
vim /opt/nuki/bridge/nuki-bridge-parser.service
/opt/nuki/bridge/nuki-bridge-parser.service
[Unit]
After=network.target
Description=Nuki Bridge Parser Service
[Service]
Type=simple
ExecStart=/opt/nuki/bridge/nuki_bridge_job-parser.sh
KillMode=process
Restart=on-failure
RestartSec=10s
RemainAfterExit=yes
User=root
Group=root
[Install]
WantedBy=multi-user.target
ln -sf /opt/nuki/bridge/nuki-bridge-parser.service /etc/systemd/system/nuki-bridge-parser.service
systemctl daemon-reload
systemctl enable nuki-bridge-parser.service
service nuki-bridge-parser restart && service nuki-bridge-parser status
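To watch what the jobs are doing while they run, tailing the journal of the installed units works well:
journalctl -u nuki-web.service -f
journalctl -u nuki-bridge-parser.service -f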
5.4 Customization parameters
- by adjusting the bash scripts you can modify the update frequency for curling and parsing the JSON files (the current default is 30 s)
- the bridge id (NUKIID) has to be added so the link between Smartlock and Bridge can be established; this could be automated, but at the moment it has to be done manually (replace it in the JQ parsing lines)
- you can customize offset and count for the BridgeLog by redefining the LOG_COUNT and LOG_OFFSET arguments (see the sketch below this list)
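As an illustration (the numbers are purely examples), polling every five minutes and pulling a larger log window would mean changing these lines in /opt/nuki/bridge/nuki_bridge_job-collector.sh:
LOG_COUNT=500 #number of log entries fetched per request (default in this guide: 100)
LOG_OFFSET=0 #start at the newest entry
sleep 300 #at the end of the while loop: poll every 5 minutes instead of every 30 seconds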
6. Download Grafana Dashboard
https://grafana.com/dashboards/7628/ or https://gitea.fablabchemnitz.de/MarioVoigt/Nukiana/src/branch/master/Grafana/Nuki%20Smartlock%20+%20Bridge%20%28Internals%29.json
7. Remaining ToDos
- translation
- at the moment the code is in English and the dashboard is only available in German
- possible solutions:
- add a language drop-down variable to Grafana + rewrite every SELECT so strings get translated automatically → decimal separators should be rewritten too
- add extra translation columns to MariaDB tables + some kind of translation string repository to fill the columns → could be utilized by DB triggers/routines
- rewrite the curl commands so that the log files (SmartlockLog, BridgeLog) are only streamed up to the point (id) where the last entry was previously loaded, in order to ...
- reduce bandwidth
- speed up the process
- make the "sed" script lines nicer → I guess they can be removed completely by using JQ
- BridgeLog: "handles" are not imported and are ignored
- the Bridge provides a timestamp but no timezoneOffset → the SQL statements were written to use the Smartlock's timezoneOffset instead → works, but dirty
- replace the TRUNCATE statements with "INSERT ... ON DUPLICATE KEY UPDATE" to keep old data while importing new data
- to implement:
- make use of the fields:
  - accountUserId
  - mode
  - capabilities
  - deviceType
  - flagGeoFence
- build some check script to verify the column count and column order of the CSV files
- collect data with time history (maybe with triggers):
- BridgeFirmwareHistory: BridgeId, date, BridgeFirmware, BridgeWifiFirmware,BridgeAppVersion
- SmartlockFirmwareHistory: SmartlockId,date,SmartlockFirmware
- history of rssi values
- history of state changes: locking states, pairing state
- to test: this project was never tested with more than one Smartlock and more than one Bridge - we only have one! I also did not test what data output a Nuki Fob produces
- maybe try sending the JSON output files directly through JQ/Filebeat → Elasticsearch → Grafana's Elasticsearch datasource ↔ is that possible?