Installing Darling on Ubuntu to run macOS software


Some information about my system:

$ uname -r
5.3.0-40-generic
$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-linux-gnu/7/lto-wrapper
OFFLOAD_TARGET_NAMES=nvptx-none
OFFLOAD_TARGET_DEFAULT=1
Target: x86_64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 7.4.0-1ubuntu1~18.04.1' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=x86_64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-vtable-verify --enable-libmpx --enable-plugin --enable-default-pie --with-system-zlib --with-target-system-zlib --enable-objc-gc=auto --enable-multiarch --disable-werror --with-arch-32=i686 --with-abi=m64 --with-multilib-list=m32,m64,mx32 --enable-multilib --with-tune=generic --enable-offload-targets=nvptx-none --without-cuda-driver --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu
Thread model: posix
gcc version 7.4.0 (Ubuntu 7.4.0-1ubuntu1~18.04.1) 

The first step of the installation:

$ sudo apt-get install cmake clang bison flex xz-utils libfuse-dev libudev-dev pkg-config libc6-dev:i386 linux-headers-generic gcc-multilib libcap2-bin libcairo2-dev libgl1-mesa-dev libtiff5-dev libfreetype6-dev libfreetype6-dev:i386 git libelf-dev libxml2-dev libegl1-mesa-dev libfontconfig1-dev libbsd-dev

Next, fetch the sources:

$ git clone --recursive https://github.com/darlinghq/darling.git

Then build:

$ cd darling
$ mkdir build && cd build
$ cmake ..
...
CMake Error at cmake/FindFFmpeg.cmake:86 (message):
  Could not find libavcodec or libavformat or libavutil
...

So I did this:

sudo apt-get install -y \
    libavformat-dev libavcodec-dev libavdevice-dev \
    libavutil-dev libswscale-dev libavresample-dev

I reran cmake:

$ cmake ..
...
CMake Error at /usr/share/cmake-3.10/Modules/FindPackageHandleStandardArgs.cmake:137 (message):
  Could NOT find PulseAudio! (missing: PULSEAUDIO_LIBRARIES
  PULSEAUDIO_INCLUDE_DIRS)
...

So I tried this:

$ sudo apt-get install pulseaudio libpulse0 
...

But those libraries were already installed, so I went to plan B:

$ sudo apt-get install libpulse-dev pulseaudio apulse

I reran cmake for the third time:

$ cmake ..
...
CMake Error at /usr/share/cmake-3.10/Modules/FindPackageHandleStandardArgs.cmake:137 (message):
  Could NOT find GIF (missing: GIF_LIBRARY GIF_INCLUDE_DIR)
...

So I tried this:

$ sudo apt-get install libgif-dev
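
In hindsight, all of the extra dependencies can be installed in one go before the first cmake run; here is a sketch that simply combines the packages found above:

$ sudo apt-get install -y libavformat-dev libavcodec-dev libavdevice-dev \
    libavutil-dev libswscale-dev libavresample-dev \
    libpulse-dev pulseaudio apulse libgif-dev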

The fourth cmake run was finally the good one:

$ cmake ..
...
$ make
...
ld: file not found: /System/Library/PrivateFrameworks/FMDB.framework/Versions/A/FMDB for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
src/external/JavaScriptCore/CMakeFiles/JavaScriptCore.dir/build.make:24779: recipe for target 'src/external/JavaScriptCore/JavaScriptCore' failed
make[2]: *** [src/external/JavaScriptCore/JavaScriptCore] Error 1
CMakeFiles/Makefile2:62019: recipe for target 'src/external/JavaScriptCore/CMakeFiles/JavaScriptCore.dir/all' failed
make[1]: *** [src/external/JavaScriptCore/CMakeFiles/JavaScriptCore.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2

Note that the rest of the installation works:

$ make lkm
...
$ sudo make lkm_install
...

I will run another test with the command:

$ git clone --recurse-submodules https://github.com/darlinghq/darling.git

Then I restarted the build:

$ time sudo make
...
[100%] Linking CXX shared library JavaScriptCore
ld: warning: OS dylibs should not add rpaths (linker option: -rpath) (Xcode build setting: LD_RUNPATH_SEARCH_PATHS)
ld: file not found: /System/Library/PrivateFrameworks/FMDB.framework/Versions/A/FMDB for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
src/external/JavaScriptCore/CMakeFiles/JavaScriptCore.dir/build.make:24779: recipe for target 'src/external/JavaScriptCore/JavaScriptCore' failed
make[2]: *** [src/external/JavaScriptCore/JavaScriptCore] Error 1
CMakeFiles/Makefile2:62019: recipe for target 'src/external/JavaScriptCore/CMakeFiles/JavaScriptCore.dir/all' failed
make[1]: *** [src/external/JavaScriptCore/CMakeFiles/JavaScriptCore.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2

real	208m3,434s
user	188m7,313s
sys	18m55,609s

The goal was to test https://www.rubitrack.com/ .

Sigh.

UPDATE:

I finally succeeded (thanks Luc):

$ darling shell
Loaded the kernel module

$ uname -a
Darwin CYBERNEURONES 16.0.0 Darwin Kernel Version 16.0.0 x86_64

$ hdiutil attach ../../../Téléchargements/rubiTrack-5.3.1.dmg 
/Volumes/rubiTrack-5.3.1

$ cp -r /Volumes/rubiTrack-5.3.1/rubiTrack\ 5\ Pro.app /Applications/

$ /Applications/rubiTrack\ 5\ Pro.app/Contents/MacOS/rubiTrack\ 5\ Pro 
dyld: Symbol not found: _WebActionNavigationTypeKey
  Referenced from: /Applications/rubiTrack 5 Pro.app/Contents/MacOS/rubiTrack 5 Pro (which was built for Mac OS X 10.11)
  Expected in: /System/Library/Frameworks/WebKit.framework/Versions/A/WebKit
 in /Applications/rubiTrack 5 Pro.app/Contents/MacOS/rubiTrack 5 Pro
abort_with_payload: reason: Symbol not found: _WebActionNavigationTypeKey
  Referenced from: /Applications/rubiTrack 5 Pro.app/Contents/MacOS/rubiTrack 5 Pro (which was built for Mac OS X 10.11)
  Expected in: /System/Library/Frameworks/WebKit.framework/Versions/A/WebKit
 in /Applications/rubiTrack 5 Pro.app/Contents/MacOS/rubiTrack 5 Pro; code: 4
Abort trap: 6

Internet Explorer on Ubuntu 18.04: is it possible? (with VirtualBox rather than libvirt)


This follows up on my previous article: http://www.cyber-neurones.org/2020/04/internet-explorer-sous-ubuntu-18-04-cest-possible-jai-pas-reussi/ . I fixed one of the problems here: http://www.cyber-neurones.org/2020/04/ubuntu-18-issue-lvm2-lvmetad-service-unit-lvm2-lvmetad-socket-is-masked/ .

Now I am trying VirtualBox mode:

$ vagrant box add windows/win10-edge 'MSEdge - Win10.box'
==> box: Box file was not detected as metadata. Adding it directly...
==> box: Adding box 'windows/win10-edge' (v0) for provider: 
    box: Unpacking necessary files from: file:///datadisk/Vagrant/MSEdge%20-%20Win10.box
    box: Progress: 4% (Rate: 341M/s, Estimated time rem...
    ...
    box: Progress: 96% (Rate: 520M/s, Estimated time rem...
==> box: Successfully added box 'windows/win10-edge' (v0) for 'virtualbox'!
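
For reference, the vagrant up below expects a Vagrantfile in the current directory. A minimal sketch (the box name matches the one just added; the gui flag and memory size are my assumptions):

Vagrant.configure("2") do |config|
  config.vm.box = "windows/win10-edge"
  config.vm.guest = :windows
  config.vm.provider "virtualbox" do |vb|
    vb.gui = true       # show the VirtualBox window so the Windows desktop is visible
    vb.memory = 4096    # assumed value, adjust to taste
  end
end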

$ vagrant up --no-destroy-on-error
Bringing machine 'default' up with 'virtualbox' provider...
==> default: Importing base box 'windows/win10-edge'...
==> default: Matching MAC address for NAT networking...
==> default: Setting the name of the VM: Vagrant_default_1587474217541_88035
Vagrant is currently configured to create VirtualBox synced folders with
the `SharedFoldersEnableSymlinksCreate` option enabled. If the Vagrant
guest is not trusted, you may want to disable this option. For more
information on this option, please refer to the VirtualBox manual:

  https://www.virtualbox.org/manual/ch04.html#sharedfolders

This option can be disabled globally with an environment variable:

  VAGRANT_DISABLE_VBOXSYMLINKCREATE=1

or on a per folder basis within the Vagrantfile:

  config.vm.synced_folder '/host/path', '/guest/path', SharedFoldersEnableSymlinksCreate: false
==> default: Clearing any previously set network interfaces...
==> default: Preparing network interfaces based on configuration...
    default: Adapter 1: nat
==> default: Forwarding ports...
    default: 22 (guest) => 2222 (host) (adapter 1)
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
    default: SSH address: 127.0.0.1:2222
    default: SSH username: ....
    default: SSH auth method: password


Next, launch VirtualBox:

And we get this:

The password is Passw0rd!, which with the keyboard layout mismatch ends up typed as Pqssz)rd/ or Pqsszàrd1.

Then it is best to switch the system to French:

Ubuntu 18: Issue: lvm2-lvmetad.service: Unit lvm2-lvmetad.socket is masked.


The full error I was getting:

Paramétrage de lvm2 (2.02.176-4.1ubuntu3.18.04.2) ...
update-initramfs: deferring update (trigger activated)
insserv: warning: script 'douane' missing LSB tags and overrides
insserv: warning: script 'douane' missing LSB tags and overrides
Failed to restart lvm2-lvmetad.service: Unit lvm2-lvmetad.socket is masked.
invoke-rc.d: initscript lvm2-lvmetad, action "restart" failed.
● lvm2-lvmetad.service - LVM2 metadata daemon
   Loaded: loaded (/lib/systemd/system/lvm2-lvmetad.service; static; vendor preset: enabled)
   Active: inactive (dead)
     Docs: man:lvmetad(8)
dpkg: erreur de traitement du paquet lvm2 (--configure) :
 installed lvm2 package post-installation script subprocess returned error exit status 1
dpkg: des problèmes de dépendances empêchent la configuration de libguestfs0:amd64 :
 libguestfs0:amd64 dépend de lvm2 ; cependant :
 Le paquet lvm2 n'est pas encore configuré.

To fix the problem I had to do:

sudo apt-get purge lvm2
sudo apt autoremove
sudo apt install lvm2
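
An alternative that is often suggested for this exact error, which I have not tested here, is to unmask the socket instead of reinstalling:

sudo systemctl unmask lvm2-lvmetad.socket
sudo systemctl start lvm2-lvmetad.socket
sudo dpkg --configure -a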

 

ENEDIS: Migrating the data for use in MariaDB / Grafana (done in Python)


I have just written a new Python program to load the ENEDIS data into MariaDB (and then graph it with Grafana).
To get the ENEDIS data you have to go to https://mon-compte-particulier.enedis.fr/home-connectee/ and create an account, then link that account to your EDF bill… I won't lie to you, it is a bit of an obstacle course. I had to call support several times before the link could be made. What a pain.

To understand the data better, read: https://espace-client-particuliers.enedis.fr/documents/18080/5456906/pdf-producteurSuiviProduction/ebd9e049-5fd1-4769-9f87-b63e8c4b051c

EAS F1 to EAS F10: the Linky meter can provide up to 10 consumption indexes (each index corresponds to one tariff slot of your supplier's offer)

EAS D1 to EAS D4: 4 consumption indexes (the Distributor calendar, used to bill grid access)

EAS T: totalizer index of the consumption. This index is used to check consistency between the consumption shown on the supplier grid and the consumption on the distributor grid.

I did this on Ubuntu, but Python works just as well on Windows, macOS, etc.

You therefore need:

  • Python.
  • MariaDB (or MySQL) (it is very simple to modify the code to send the data to another destination)
  • Grafana.
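
On Ubuntu these prerequisites can be installed roughly as follows (a sketch: package names may differ on your release, mysql-connector-python is the module the script imports as mysql.connector, and Grafana itself comes from the packages on grafana.com):

$ sudo apt-get install python3 python3-pip mariadb-server
$ sudo pip3 install mysql-connector-python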

A quick reminder on how to add a database and a user in MariaDB/MySQL:

$ sudo mysql -u root 
[sudo] password for XXXX: 
Welcome to the MariaDB monitor.  Commands end with ; or \g.
Your MariaDB connection id is 273026
Server version: 10.1.44-MariaDB-0ubuntu0.18.04.1 Ubuntu 18.04

Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

MariaDB [(none)]> create database ENEDIS;
Query OK, 1 row affected (0.00 sec)

MariaDB [(none)]> CREATE USER 'enedis'@'localhost' IDENTIFIED BY 'enedis';
Query OK, 0 rows affected (0.01 sec)

MariaDB [(none)]> GRANT ALL PRIVILEGES ON ENEDIS.* TO 'enedis'@'localhost';
Query OK, 0 rows affected (0.00 sec)

MariaDB [(none)]> FLUSH PRIVILEGES;
Query OK, 0 rows affected (0.00 sec)

MariaDB [(none)]> \quit
Bye

Next, hook it up to Grafana:
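
Concretely, this means adding a data source of type MySQL in Grafana that points at the ENEDIS database (host 127.0.0.1:3306, user enedis). As a sketch, the same thing can be done with a provisioning file, for example /etc/grafana/provisioning/datasources/enedis.yaml (path and file name are my choice):

apiVersion: 1
datasources:
  - name: ENEDIS
    type: mysql
    url: 127.0.0.1:3306
    database: ENEDIS
    user: enedis
    secureJsonData:
      password: enedis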

Here is the Python program (version 1, which I will improve later). Note that you must put the full path of your own file in place of Enedis_Conso_Jour_XXXXX-XXXX_YYYYYY.csv.

The source is available here: https://github.com/farias06/Grafana/blob/master/ENEDIS_CSV_insert.py

#! /usr/bin/env python3
# -*- coding: latin-1 -*-

# @author <@cyber-neurones.org>

# Version 1 

import csv
from datetime import datetime
import mysql.connector
import re
from mysql.connector import errorcode
from mysql.connector import (connection)
#import numpy as np

def days_between(d1, d2):
    d1 = datetime.strptime(d1, "%Y-%m-%d %H:%M:%S")
    d2 = datetime.strptime(d2, "%Y-%m-%d %H:%M:%S")
    return abs((d2 - d1).days)

def clean_tab(d):
     if d != "":
         return int(d);
     else:
         return 0

cnx = connection.MySQLConnection(user='enedis', password='enedis',
                                 host='127.0.0.1',
                                 database='ENEDIS')
cursor = cnx.cursor();
now = datetime.now().date();

#cursor.execute("DROP TABLE COMPTEUR;");
#cursor.execute("CREATE TABLE COMPTEUR (DATE datetime,TYPE_RELEVE varchar(50),EAS_F1 int, EAS_F2 int, EAS_F3 int , EAS_F4 int, EAS_F5 int, EAS_F6 int , EAS_F7 int, EAS_F8 int, EAS_F9 int, EAS_F10 int, EAS_D1 int, EAS_D2 int, EAS_D3 int,EAS_D4 int, EAS_T  int );");
cursor.execute("DELETE FROM COMPTEUR");
cnx.commit();

MyType_Previous = "None";
MyEAS_F1_Previous = 0;
MyEAS_F1 = 0
Diff_EAS_T_int = 0

with open('Enedis_Conso_Jour_XXXXX-XXXX_YYYYYY.csv', 'r') as csvfile:
    reader = csv.reader(csvfile, delimiter=';')
    for row in reader:
        Nb = len(row);
        #row.replace(np.nan, 0)
        #print ("Nb:"+str(Nb));
        if (Nb == 17):
            MyDate=row[0].replace("+02:00", "")
            MyDate=MyDate.replace("T", " ")
            MyDate=MyDate.replace("+01:00", "")
            MyType=row[1].replace("'", " ")
            if (MyType == "Arrêté quotidien"):
                MyEAS_F1=clean_tab(row[2])
                MyEAS_F2=clean_tab(row[3])
                MyEAS_F3=clean_tab(row[4])
                MyEAS_F4=clean_tab(row[5])
                MyEAS_F5=clean_tab(row[6])
                MyEAS_F6=clean_tab(row[7])
                MyEAS_F7=clean_tab(row[8])
                MyEAS_F8=clean_tab(row[9])
                MyEAS_F9=clean_tab(row[10])
                MyEAS_F10=clean_tab(row[11])
                MyEAS_D1=clean_tab(row[12])
                MyEAS_D2=clean_tab(row[13])
                MyEAS_D3=clean_tab(row[14])
                MyEAS_D4=clean_tab(row[15])
                MyEAS_T=clean_tab(row[16])

            if (MyType_Previous == MyType):
                #print(MyType_Previous+"/"+MyType);
                Day=days_between(MyDate,MyDate_Previous);
                #print("Diff in days"+str(Day));
            else:
                Day = 0    

            if (Day == 1):
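                # consecutive readings (1 day apart): store the day-to-day difference of each index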
                Diff_EAS_F1 = str(MyEAS_F1-MyEAS_F1_Previous);
                Diff_EAS_F2 = str(MyEAS_F2-MyEAS_F2_Previous);
                Diff_EAS_F3 = str(MyEAS_F3-MyEAS_F3_Previous);
                Diff_EAS_F4 = str(MyEAS_F4-MyEAS_F4_Previous);
                Diff_EAS_F5 = str(MyEAS_F5-MyEAS_F5_Previous);
                Diff_EAS_F6 = str(MyEAS_F6-MyEAS_F6_Previous);
                Diff_EAS_F7 = str(MyEAS_F7-MyEAS_F7_Previous);
                Diff_EAS_F8 = str(MyEAS_F8-MyEAS_F8_Previous);
                Diff_EAS_F9 = str(MyEAS_F9-MyEAS_F9_Previous);
                Diff_EAS_F10 = str(MyEAS_F10-MyEAS_F10_Previous);
                Diff_EAS_D1 = str(MyEAS_D1-MyEAS_D1_Previous);
                Diff_EAS_D2 = str(MyEAS_D2-MyEAS_D2_Previous);
                Diff_EAS_D3 = str(MyEAS_D3-MyEAS_D3_Previous);
                Diff_EAS_D4 = str(MyEAS_D4-MyEAS_D4_Previous);
                Diff_EAS_T_int = (MyEAS_T-MyEAS_T_Previous)/Day;
                Diff_EAS_T = str(Diff_EAS_T_int);

                if ((MyType == "Arrêté quotidien") and (Diff_EAS_T_int > 0)):
                    try :
                        Requesq_SQL="INSERT INTO COMPTEUR (DATE,TYPE_RELEVE,EAS_F1, EAS_F2, EAS_F3 , EAS_F4, EAS_F5, EAS_F6 , EAS_F7 , EAS_F8 , EAS_F9 , EAS_F10 , EAS_D1 , EAS_D2 , EAS_D3 ,EAS_D4 , EAS_T) VALUES ('"+MyDate+"', '"+MyType+"', "+Diff_EAS_F1+","+Diff_EAS_F2+", "+Diff_EAS_F3+", "+Diff_EAS_F4+", "+Diff_EAS_F5+", "+Diff_EAS_F6+", "+Diff_EAS_F7+","+Diff_EAS_F8+", "+Diff_EAS_F9+", "+Diff_EAS_F10+","+Diff_EAS_D1+","+Diff_EAS_D2+","+Diff_EAS_D3+","+Diff_EAS_D4+","+Diff_EAS_T+");";
                        #print(Requesq_SQL);
                        cursor.execute(Requesq_SQL);
                    except mysql.connector.Error as err:
                        print("Something went wrong: {}".format(err))
                        if err.errno == errorcode.ER_BAD_TABLE_ERROR:
                            print("Creating table COMPTEUR")
                        else:
                            None

            if (Day > 1):
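                # gap of several days: spread the difference evenly over the missing days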
                print ("Day > 1 :"+str(Day)) 
                Diff_EAS_F1 = str((MyEAS_F1-MyEAS_F1_Previous)/Day);
                Diff_EAS_F2 = str((MyEAS_F2-MyEAS_F2_Previous)/Day);
                Diff_EAS_F3 = str((MyEAS_F3-MyEAS_F3_Previous)/Day);
                Diff_EAS_F4 = str((MyEAS_F4-MyEAS_F4_Previous)/Day);
                Diff_EAS_F5 = str((MyEAS_F5-MyEAS_F5_Previous)/Day);
                Diff_EAS_F6 = str((MyEAS_F6-MyEAS_F6_Previous)/Day);
                Diff_EAS_F7 = str((MyEAS_F7-MyEAS_F7_Previous)/Day);
                Diff_EAS_F8 = str((MyEAS_F8-MyEAS_F8_Previous)/Day);
                Diff_EAS_F9 = str((MyEAS_F9-MyEAS_F9_Previous)/Day);
                Diff_EAS_F10 = str((MyEAS_F10-MyEAS_F10_Previous)/Day);
                Diff_EAS_D1 = str((MyEAS_D1-MyEAS_D1_Previous)/Day);
                Diff_EAS_D2 = str((MyEAS_D2-MyEAS_D2_Previous)/Day);
                Diff_EAS_D3 = str((MyEAS_D3-MyEAS_D3_Previous)/Day);
                Diff_EAS_D4 = str((MyEAS_D4-MyEAS_D4_Previous)/Day);
                Diff_EAS_T_int = (MyEAS_T-MyEAS_T_Previous)/Day;
                Diff_EAS_T = str(Diff_EAS_T_int);

                if ((MyType == "Arrêté quotidien") and (Diff_EAS_T_int > 0)):
                    try :
                        Requesq_SQL="INSERT INTO COMPTEUR (DATE,TYPE_RELEVE,EAS_F1, EAS_F2, EAS_F3 , EAS_F4, EAS_F5, EAS_F6 , EAS_F7 , EAS_F8 , EAS_F9 , EAS_F10 , EAS_D1 , EAS_D2 , EAS_D3 ,EAS_D4 , EAS_T) VALUES ('"+MyDate+"', '"+MyType+"', "+Diff_EAS_F1+","+Diff_EAS_F2+", "+Diff_EAS_F3+", "+Diff_EAS_F4+", "+Diff_EAS_F5+", "+Diff_EAS_F6+", "+Diff_EAS_F7+","+Diff_EAS_F8+", "+Diff_EAS_F9+", "+Diff_EAS_F10+","+Diff_EAS_D1+","+Diff_EAS_D2+","+Diff_EAS_D3+","+Diff_EAS_D4+","+Diff_EAS_T+");";
                        print(Requesq_SQL);
                        cursor.execute(Requesq_SQL);
                    except mysql.connector.Error as err:
                        print("Something went wrong: {}".format(err))
                        if err.errno == errorcode.ER_BAD_TABLE_ERROR:
                            print("Creating table COMPTEUR")
                        else:
                            None

            # Save Previous
            if ((MyType == "Arrêté quotidien") and (Diff_EAS_T_int >= 0)):
                MyDate_Previous=MyDate;
                MyType_Previous=MyType;
                MyEAS_F1_Previous=MyEAS_F1;
                MyEAS_F2_Previous=MyEAS_F2;
                MyEAS_F3_Previous=MyEAS_F3;
                MyEAS_F4_Previous=MyEAS_F4;
                MyEAS_F5_Previous=MyEAS_F5;
                MyEAS_F6_Previous=MyEAS_F6;
                MyEAS_F7_Previous=MyEAS_F7;
                MyEAS_F8_Previous=MyEAS_F8;
                MyEAS_F9_Previous=MyEAS_F9;
                MyEAS_F10_Previous=MyEAS_F10;
                MyEAS_D1_Previous=MyEAS_D1;
                MyEAS_D2_Previous=MyEAS_D2;
                MyEAS_D3_Previous=MyEAS_D3;
                MyEAS_D4_Previous=MyEAS_D4;
                MyEAS_T_Previous=MyEAS_T;


cnx.commit();
cursor.close();
cnx.close();

# END 
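
To run it (assuming the script was saved as ENEDIS_CSV_insert.py, the name used in the repository above):

$ python3 ENEDIS_CSV_insert.py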

Next, on to the graphical visualization:

  • To see the total consumption:
SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  EAS_T as value,
  "TOTAL" as metric
FROM COMPTEUR
WHERE $__timeFilter(date)
ORDER BY date ASC

The other graphs depend on your tariff plan… for my part I have EAS D1 (peak hours):

SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  EAS_D1 as value,
  "Heures pleines" as metric
FROM COMPTEUR
WHERE $__timeFilter(date)
ORDER BY date ASC

And also EAS D2 (night):

SELECT
UNIX_TIMESTAMP(date) as time_sec,
EAS_D2 as value,
"Heures creuses" as metric
FROM COMPTEUR
WHERE $__timeFilter(date)
ORDER BY date ASC

I will keep improving it in future versions, patience…

Slack: Migrating the access-log data to MariaDB for use in Grafana


To use the script you need:

  • MariaDB
  • Python
  • Grafana.

Slack lets you download a CSV file (access_logs.csv) whose columns are the following:

  • Date Accessed,
  • User Agent – Simple,
  • User Agent – Full,
  • IP Address,
  • Number of Logins,
  • Last Date Accessed

A quick reminder on adding a database and a user:

$ sudo mysql -u root

MariaDB [(none)]> create database SLACK;

MariaDB [(none)]> CREATE USER 'slack'@'localhost' IDENTIFIED BY 'slack';

MariaDB [(none)]> GRANT ALL PRIVILEGES ON SLACK.* TO 'slack'@'localhost';

MariaDB [(none)]> FLUSH PRIVILEGES;

MariaDB [(none)]> \quit
Bye

Also a quick reminder on installing a Python module that is not available:

$ sudo pip install python-dateutil

The program source (also available here: https://github.com/farias06/Grafana/blob/master/Slack_CSV_insert.py):

#! /usr/bin/env python3
# -*- coding: utf-8 -*-

import csv
from datetime import datetime
from dateutil.parser import parse
import mysql.connector
from mysql.connector import errorcode
from mysql.connector import (connection)


cnx = connection.MySQLConnection(user='slack', password='slack',
                                 host='127.0.0.1',
                                 database='SLACK')
cursor = cnx.cursor();
now = datetime.now().date();

#cursor.execute("DROP TABLE SLACK;");
#cursor.execute("CREATE TABLE SLACK (DATE datetime, DATE_LAST datetime, USER_AGENT varchar(50),USER_AGENT_FULL varchar(256), IP varchar(26), NUMBER int);");
cursor.execute("DELETE FROM SLACK");
cnx.commit();

with open('access_logs.csv', 'r') as csvfile:
    reader = csv.reader(csvfile, quotechar='"')
    for row in reader:
        MyDate=row[0];
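        # drop any trailing '(...)' part of the date so that dateutil can parse it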
        MyDate = MyDate.rsplit('(',1)[0];
        if (MyDate == "Date Accessed"):
           print("No");
        else:
           Dt = parse(MyDate)
           MyUser=row[1];
           MyUser=MyUser.replace("'", " ")
           MyUserFull=row[2];
           MyUserFull=MyUserFull.replace("'", " ")
           MyIP=row[3];
           MyNumber=row[4];
           MyDateLast=row[5];
           MyDateLast = MyDateLast.rsplit('(',1)[0];
           DtLast = parse(MyDateLast)
           try :
              SQLREQUEST = "INSERT INTO SLACK (DATE, USER_AGENT, USER_AGENT_FULL, IP, DATE_LAST, NUMBER) VALUES ('"+str(Dt.date())+" "+str(Dt.time())+"', '"+MyUser+"', '"+MyUserFull+"','"+MyIP+"', '"+str(DtLast.date())+" "+str(DtLast.time())+"', "+MyNumber+" );";
              cursor.execute(SQLREQUEST);
           except mysql.connector.Error as err:
              print("Something went wrong: {}".format(err))
              if err.errno == errorcode.ER_BAD_TABLE_ERROR:
                 print("Creating table SLACK")
              else:
                 None

cnx.commit();
cursor.close();
cnx.close();

# END 

To run the program:

$ python Slack_CSV_insert.py

Then, to view the data, several queries are possible for the metric:

By IP:

SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  SUM(number) as value,
  ip as metric
FROM SLACK
WHERE $__timeFilter(date)
GROUP BY day(date),month(date),year(date),ip
ORDER BY date ASC

By User Agent:

SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  SUM(number) as value,
  user_agent as metric
FROM SLACK
WHERE $__timeFilter(date)
GROUP BY day(date),month(date),year(date),user_agent
ORDER BY date ASC

By User Agent Full:

SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  SUM(number) as value,
  user_agent_full as metric
FROM SLACK
WHERE $__timeFilter(date)
GROUP BY day(date),month(date),year(date),user_agent_full
ORDER BY date ASC

I noticed a bug: I use the desktop version on Linux, yet there is no "Linux Desktop Application" entry.