ENEDIS : migrating the data for use in MariaDB / Grafana (done in Python)

I have just written a new Python program to load the ENEDIS data into MariaDB & Grafana.
To get your ENEDIS data, go to https://mon-compte-particulier.enedis.fr/home-connectee/ and create an account, then link that account to your EDF bill… I won't lie to you, it is a bit of an obstacle course. I had to contact support several times before the link could be made. Misery.

To better understand the data, read: https://espace-client-particuliers.enedis.fr/documents/18080/5456906/pdf-producteurSuiviProduction/ebd9e049-5fd1-4769-9f87-b63e8c4b051c

EAS F1 to EAS F10: the Linky meter provides up to 10 consumption indexes (each index corresponds to one tariff period of your supplier's offer).

EAS D1 to EAS D4: 4 consumption indexes (the Distributor's calendar, used for billing grid access).

EAS T: totalizer index of consumption. This index is used to check consistency between the consumption shown on the supplier grid and the consumption on the distributor grid.
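
For reference, the daily CSV export is semicolon-separated with 17 columns per data row: the date, the reading type (« Arrêté quotidien »), the ten EAS F indexes, the four EAS D indexes, and EAS T. A made-up sample line (the values are illustrative, real ones are cumulative meter indexes):

2019-03-01T00:00:00+01:00;Arrêté quotidien;1234567;0;0;0;0;0;0;0;0;0;1000000;234567;0;0;1234567

The program below keys on exactly these 17-column rows and on the « Arrêté quotidien » reading type.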

I did this on Ubuntu, but Python runs just as well on Windows, macOS, etc.

You therefore need:

  • Python.
  • MariaDB (or MySQL); it is very easy to modify the code to send the data to another destination.
  • Grafana.
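
The script uses the MySQL Connector/Python driver (imported below as mysql.connector); if it is not already installed, assuming you use pip:

pip install mysql-connector-python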

A quick reminder on how to add the database and user in MariaDB/MySQL:

$ sudo mysql -u root 
[sudo] password for XXXX: 
Welcome to the MariaDB monitor.  Commands end with ; or \g.
Your MariaDB connection id is 273026
Server version: 10.1.44-MariaDB-0ubuntu0.18.04.1 Ubuntu 18.04

Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

MariaDB [(none)]> create database ENEDIS;
Query OK, 1 row affected (0.00 sec)

MariaDB [(none)]> CREATE USER 'enedis'@'localhost' IDENTIFIED BY 'enedis';
Query OK, 0 rows affected (0.01 sec)

MariaDB [(none)]> GRANT ALL PRIVILEGES ON ENEDIS.* TO 'enedis'@'localhost';
Query OK, 0 rows affected (0.00 sec)

MariaDB [(none)]> FLUSH PRIVILEGES;
Query OK, 0 rows affected (0.00 sec)

MariaDB [(none)]> \quit
Bye
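
Note that the Python program below expects the COMPTEUR table to already exist (its CREATE TABLE statement is left commented out in the source), so create it once by hand:

MariaDB [(none)]> CREATE TABLE ENEDIS.COMPTEUR (DATE datetime, TYPE_RELEVE varchar(50), EAS_F1 int, EAS_F2 int, EAS_F3 int, EAS_F4 int, EAS_F5 int, EAS_F6 int, EAS_F7 int, EAS_F8 int, EAS_F9 int, EAS_F10 int, EAS_D1 int, EAS_D2 int, EAS_D3 int, EAS_D4 int, EAS_T int);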

Next, the database has to be hooked up to Grafana: add a MySQL data source pointing to the ENEDIS database with the enedis user created above.

Here is the Python program (version 1, which I will improve later on). Note that you must put the full path of your own file in place of Enedis_Conso_Jour_XXXXX-XXXX_YYYYYY.csv.

Les sources sont disponibles ici : https://github.com/farias06/Grafana/blob/master/ENEDIS_CSV_insert.py

#! /usr/bin/env python3
# -*- coding: Latin-1 -*-

# @author <@cyber-neurones.org>

# Version 1 

import csv
from datetime import datetime
import mysql.connector
import re
from mysql.connector import errorcode
from mysql.connector import (connection)
#import numpy as np

def days_between(d1, d2):
    d1 = datetime.strptime(d1, "%Y-%m-%d %H:%M:%S")
    d2 = datetime.strptime(d2, "%Y-%m-%d %H:%M:%S")
    return abs((d2 - d1).days)

def clean_tab(d):
    # Empty CSV fields become 0
    if d != "":
        return int(d)
    else:
        return 0

cnx = connection.MySQLConnection(user='enedis', password='enedis',
                                 host='127.0.0.1',
                                 database='ENEDIS')
cursor = cnx.cursor();
now = datetime.now().date();

#cursor.execute("DROP TABLE COMPTEUR;");
#cursor.execute("CREATE TABLE COMPTEUR (DATE datetime,TYPE_RELEVE varchar(50),EAS_F1 int, EAS_F2 int, EAS_F3 int , EAS_F4 int, EAS_F5 int, EAS_F6 int , EAS_F7 int, EAS_F8 int, EAS_F9 int, EAS_F10 int, EAS_D1 int, EAS_D2 int, EAS_D3 int,EAS_D4 int, EAS_T  int );");
cursor.execute("DELETE FROM COMPTEUR");
cnx.commit();

MyType_Previous = "None";
MyEAS_F1_Previous = 0;
MyEAS_F1 = 0
Diff_EAS_T_int = 0

with open('Enedis_Conso_Jour_XXXXX-XXXX_YYYYYY.csv', 'r') as csvfile:
    reader = csv.reader(csvfile, delimiter=';')
    for row in reader:
        Nb = len(row);
        #row.replace(np.nan, 0)
        #print ("Nb:"+str(Nb));
        if (Nb == 17):
            MyDate=row[0].replace("+02:00", "")
            MyDate=MyDate.replace("T", " ")
            MyDate=MyDate.replace("+01:00", "")
            MyType=row[1].replace("'", " ")
            if (MyType == "Arrêté quotidien"):
                MyEAS_F1=clean_tab(row[2])
                MyEAS_F2=clean_tab(row[3])
                MyEAS_F3=clean_tab(row[4])
                MyEAS_F4=clean_tab(row[5])
                MyEAS_F5=clean_tab(row[6])
                MyEAS_F6=clean_tab(row[7])
                MyEAS_F7=clean_tab(row[8])
                MyEAS_F8=clean_tab(row[9])
                MyEAS_F9=clean_tab(row[10])
                MyEAS_F10=clean_tab(row[11])
                MyEAS_D1=clean_tab(row[12])
                MyEAS_D2=clean_tab(row[13])
                MyEAS_D3=clean_tab(row[14])
                MyEAS_D4=clean_tab(row[15])
                MyEAS_T=clean_tab(row[16])

            if (MyType_Previous == MyType):
                #print(MyType_Previous+"/"+MyType);
                Day=days_between(MyDate,MyDate_Previous);
                #print("Diff in days"+str(Day));
            else:
                Day = 0    

            if (Day == 1):
                Diff_EAS_F1 = str(MyEAS_F1-MyEAS_F1_Previous);
                Diff_EAS_F2 = str(MyEAS_F2-MyEAS_F2_Previous);
                Diff_EAS_F3 = str(MyEAS_F3-MyEAS_F3_Previous);
                Diff_EAS_F4 = str(MyEAS_F4-MyEAS_F4_Previous);
                Diff_EAS_F5 = str(MyEAS_F5-MyEAS_F5_Previous);
                Diff_EAS_F6 = str(MyEAS_F6-MyEAS_F6_Previous);
                Diff_EAS_F7 = str(MyEAS_F7-MyEAS_F7_Previous);
                Diff_EAS_F8 = str(MyEAS_F8-MyEAS_F8_Previous);
                Diff_EAS_F9 = str(MyEAS_F9-MyEAS_F9_Previous);
                Diff_EAS_F10 = str(MyEAS_F10-MyEAS_F10_Previous);
                Diff_EAS_D1 = str(MyEAS_D1-MyEAS_D1_Previous);
                Diff_EAS_D2 = str(MyEAS_D2-MyEAS_D2_Previous);
                Diff_EAS_D3 = str(MyEAS_D3-MyEAS_D3_Previous);
                Diff_EAS_D4 = str(MyEAS_D4-MyEAS_D4_Previous);
                Diff_EAS_T_int = (MyEAS_T-MyEAS_T_Previous)/Day;
                Diff_EAS_T = str(Diff_EAS_T_int);

                if ((MyType == "Arrêté quotidien") and (Diff_EAS_T_int > 0)):
                    try :
                        Requesq_SQL="INSERT INTO COMPTEUR (DATE,TYPE_RELEVE,EAS_F1, EAS_F2, EAS_F3 , EAS_F4, EAS_F5, EAS_F6 , EAS_F7 , EAS_F8 , EAS_F9 , EAS_F10 , EAS_D1 , EAS_D2 , EAS_D3 ,EAS_D4 , EAS_T) VALUES ('"+MyDate+"', '"+MyType+"', "+Diff_EAS_F1+","+Diff_EAS_F2+", "+Diff_EAS_F3+", "+Diff_EAS_F4+", "+Diff_EAS_F5+", "+Diff_EAS_F6+", "+Diff_EAS_F7+","+Diff_EAS_F8+", "+Diff_EAS_F9+", "+Diff_EAS_F10+","+Diff_EAS_D1+","+Diff_EAS_D2+","+Diff_EAS_D3+","+Diff_EAS_D4+","+Diff_EAS_T+");";
                        #print(Requesq_SQL)
                        cursor.execute(Requesq_SQL);
                    except mysql.connector.Error as err:
                        print("Something went wrong: {}".format(err))
                        if err.errno == errorcode.ER_BAD_TABLE_ERROR:
                            print("Creating table COMPTEUR")
                        else:
                            pass

            if (Day > 1):
                print ("Day > 1 :"+str(Day)) 
                Diff_EAS_F1 = str((MyEAS_F1-MyEAS_F1_Previous)/Day);
                Diff_EAS_F2 = str((MyEAS_F2-MyEAS_F2_Previous)/Day);
                Diff_EAS_F3 = str((MyEAS_F3-MyEAS_F3_Previous)/Day);
                Diff_EAS_F4 = str((MyEAS_F4-MyEAS_F4_Previous)/Day);
                Diff_EAS_F5 = str((MyEAS_F5-MyEAS_F5_Previous)/Day);
                Diff_EAS_F6 = str((MyEAS_F6-MyEAS_F6_Previous)/Day);
                Diff_EAS_F7 = str((MyEAS_F7-MyEAS_F7_Previous)/Day);
                Diff_EAS_F8 = str((MyEAS_F8-MyEAS_F8_Previous)/Day);
                Diff_EAS_F9 = str((MyEAS_F9-MyEAS_F9_Previous)/Day);
                Diff_EAS_F10 = str((MyEAS_F10-MyEAS_F10_Previous)/Day);
                Diff_EAS_D1 = str((MyEAS_D1-MyEAS_D1_Previous)/Day);
                Diff_EAS_D2 = str((MyEAS_D2-MyEAS_D2_Previous)/Day);
                Diff_EAS_D3 = str((MyEAS_D3-MyEAS_D3_Previous)/Day);
                Diff_EAS_D4 = str((MyEAS_D4-MyEAS_D4_Previous)/Day);
                Diff_EAS_T_int = (MyEAS_T-MyEAS_T_Previous)/Day;
                Diff_EAS_T = str(Diff_EAS_T_int);

                if ((MyType == "Arrêté quotidien") and (Diff_EAS_T_int > 0)):
                    try :
                        Requesq_SQL="INSERT INTO COMPTEUR (DATE,TYPE_RELEVE,EAS_F1, EAS_F2, EAS_F3 , EAS_F4, EAS_F5, EAS_F6 , EAS_F7 , EAS_F8 , EAS_F9 , EAS_F10 , EAS_D1 , EAS_D2 , EAS_D3 ,EAS_D4 , EAS_T) VALUES ('"+MyDate+"', '"+MyType+"', "+Diff_EAS_F1+","+Diff_EAS_F2+", "+Diff_EAS_F3+", "+Diff_EAS_F4+", "+Diff_EAS_F5+", "+Diff_EAS_F6+", "+Diff_EAS_F7+","+Diff_EAS_F8+", "+Diff_EAS_F9+", "+Diff_EAS_F10+","+Diff_EAS_D1+","+Diff_EAS_D2+","+Diff_EAS_D3+","+Diff_EAS_D4+","+Diff_EAS_T+");";
                        print(Requesq_SQL)
                        cursor.execute(Requesq_SQL);
                    except mysql.connector.Error as err:
                        print("Something went wrong: {}".format(err))
                        if err.errno == errorcode.ER_BAD_TABLE_ERROR:
                            print("Creating table COMPTEUR")
                        else:
                            pass

            # Save Previous
            if ((MyType == "Arrêté quotidien") and (Diff_EAS_T_int >= 0)):
                MyDate_Previous=MyDate;
                MyType_Previous=MyType;
                MyEAS_F1_Previous=MyEAS_F1;
                MyEAS_F2_Previous=MyEAS_F2;
                MyEAS_F3_Previous=MyEAS_F3;
                MyEAS_F4_Previous=MyEAS_F4;
                MyEAS_F5_Previous=MyEAS_F5;
                MyEAS_F6_Previous=MyEAS_F6;
                MyEAS_F7_Previous=MyEAS_F7;
                MyEAS_F8_Previous=MyEAS_F8;
                MyEAS_F9_Previous=MyEAS_F9;
                MyEAS_F10_Previous=MyEAS_F10;
                MyEAS_D1_Previous=MyEAS_D1;
                MyEAS_D2_Previous=MyEAS_D2;
                MyEAS_D3_Previous=MyEAS_D3;
                MyEAS_D4_Previous=MyEAS_D4;
                MyEAS_T_Previous=MyEAS_T;


cnx.commit();
cursor.close();
cnx.close();

# END 

Then we move on to the graphical visualization:

  • View the total consumption:
SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  EAS_T as value,
  "TOTAL" as metric
FROM COMPTEUR
WHERE $__timeFilter(date)
ORDER BY date ASC

The other graphs depend on your tariff plan… in my case I have EAS D1 (peak hours):

SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  EAS_D1 as value,
  "Heures pleines" as metric
FROM COMPTEUR
WHERE $__timeFilter(date)
ORDER BY date ASC

And also EAS D2 (off-peak, at night):

SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  EAS_D2 as value,
  "Heures creuses" as metric
FROM COMPTEUR
WHERE $__timeFilter(date)
ORDER BY date ASC
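
Since EAS T is the totalizer index (see the definitions above), it can also be used to sanity-check the distributor grid; a minimal sketch, assuming a tariff plan that only uses EAS D1 and EAS D2:

SELECT
  UNIX_TIMESTAMP(date) as time_sec,
  EAS_T - (EAS_D1 + EAS_D2) as value,
  "Ecart T vs D" as metric
FROM COMPTEUR
WHERE $__timeFilter(date)
ORDER BY date ASC

A flat line at zero means the two grids agree day by day.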

I will keep improving it over the next versions, patience…

How to import WordPress data (RSS feed) into Joplin?

Install Joplin ( https://joplin.cozic.net ) and start the REST API by enabling the Web Clipper service. (Easy)

Step 1 : Put this script in a folder (it needs the feedparser and requests packages).

Step 2 : Edit the script and put in your token (see the sanity check below).

Step 3 : Run the script.
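
Before running the full import, you can check that the API is reachable and the token is valid; a minimal sketch, assuming Joplin's default port 41184 and its /ping endpoint:

import requests

ip = "127.0.0.1"
port = "41184"
token = "Put your token here"

# /ping answers "JoplinClipperServer" without a token
print(requests.get("http://"+ip+":"+port+"/ping").text)
# listing notes requires a valid token: expect status code 200
print(requests.get("http://"+ip+":"+port+"/notes?token="+token).status_code)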

The script:

#
# Version 1 
# for Python 3
# 
#   ARIAS Frederic
#   Sorry ... It's difficult for me the python :)
#

import feedparser
from os import listdir
from pathlib import Path
import glob
import csv
import locale
import os
import time
from datetime import datetime
import json
import requests

#Token
ip = "127.0.0.1"
port = "41184"
token = "Put your token here"

nb_import = 0;
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}

url_notes = (
    "http://"+ip+":"+port+"/notes?"
    "token="+token
)
url_folders = (
    "http://"+ip+":"+port+"/folders?"
    "token="+token
)
url_tags = (
    "http://"+ip+":"+port+"/tags?"
    "token="+token
)
url_ressources = (
    "http://"+ip+":"+port+"/ressources?"
    "token="+token
)

#Init
Wordpress_UID = "12345678901234567801234567890123"
UID = {}

payload = {
    "id":Wordpress_UID,
    "title":"Wordpress Import"
}

try:
    resp = requests.post(url_folders, data=json.dumps(payload, separators=(',',':')), headers=headers)
    resp.raise_for_status()
    resp_dict = resp.json()
    print(resp_dict)
    print("My ID")
    print(resp_dict['id'])
    Wordpress_UID_real = resp_dict['id']
    save = str(resp_dict['id'])
    UID[Wordpress_UID]= save
except requests.exceptions.HTTPError as e:
    print("Bad HTTP status code:", e)
except requests.exceptions.RequestException as e:
    print("Network error:", e)

feed = feedparser.parse("https://www.cyber-neurones.org/feed/")

feed_title = feed['feed']['title']
feed_entries = feed.entries

numero = 0
nb_entries = 1
nb_metadata_import = 1

# WordPress feed pages start at 1; loop until a page comes back empty
while nb_entries > 0 :
  numero += 1
  print ("----- Page ",numero,"-------")
  url = "https://www.cyber-neurones.org/feed/?paged="+str(numero)
  feed = feedparser.parse(url)
  feed_title = feed['feed']['title']
  feed_entries = feed.entries
  nb_entries = len(feed['entries'])
  for entry in feed.entries:
     nb_metadata_import += 1
     my_title = entry.title
     my_link = entry.link
     article_published_at = entry.published # Unicode string
     article_published_at_parsed = entry.published_parsed # Time object
     article_author = entry.author
     timestamp = time.mktime(entry.published_parsed)*1000
     print("Published at "+article_published_at)
     my_body = entry.description
     payload_note = {
         "parent_id":Wordpress_UID_real,
         "title":my_title,
         "source":"Wordpress",
         "source_url":my_link,
         "order":nb_metadata_import,
         "user_created_time":timestamp,
         "user_updated_time":timestamp,
         "author":article_author,
         "body_html":my_body
         }
     payload_note_put = {
         "source":"Wordpress",
         "order":nb_metadata_import,
         "source_url":my_link,
         "user_created_time":timestamp,
         "user_updated_time":timestamp,
         "author":article_author
         }

     try:
         resp = requests.post(url_notes, json=payload_note)
         resp.raise_for_status()
         resp_dict = resp.json()
         print(resp_dict)
         print(resp_dict['id'])
         myuid= resp_dict['id']
     except requests.exceptions.HTTPError as e:
         print("Bad HTTP status code:", e)
     except requests.exceptions.RequestException as e:
         print("Network error:", e)

     url_notes_put = (
         "http://"+ip+":"+port+"/notes/"+myuid+"?"
         "token="+token
     )
     try:
         resp = requests.put(url_notes_put, json=payload_note_put)
         resp.raise_for_status()
         resp_dict = resp.json()
         print(resp_dict)
     except requests.exceptions.HTTPError as e:
         print("Bad HTTP status code:", e)
     except requests.exceptions.RequestException as e:
         print("Network error:", e)

Diaro App : DiaroBackup.xml : How to parse it in Python? (Draft n°3)

(See the final release: https://www.cyber-neurones.org/2019/02/diaro-app-pixel-crater-ltd-diarobackup-xml-how-to-migrate-data-to-joplin/ )

Now, with release V3, it's possible to import the data… The last remaining issue is with user_created_time and user_updated_time.

The REST API is very good ( https://joplin.cozic.net/api/ ), but if it's not too complex, here is my wish list:

  1. Add the possibility to choose the ID of a folder.
  2. Add the possibility to choose the ID of tags.
  3. Add the possibility to do a PUT on a note to append [](:/ID_RESOURCE) at the end of its text. The syntax: PUT /ressources/ID_RESSOURCE/notes/ID_NOTE?token=…
  4. Add the possibility to attach tags to a note by ID instead of by text.

My latest source:

#
# Version 3 
# for Python 3
# 
#   ARIAS Frederic
#   Sorry ... It's difficult for me the python :)
#

import xml.etree.ElementTree as etree
from time import gmtime, strftime
import time
import json
import requests
import os

print(strftime("%Y-%m-%d %H:%M:%S", gmtime()))  # log start time
start = time.time()

#Token
ip = "127.0.0.1"
port = "41184"
token = "ABCD123ABCD123ABCD123ABCD123ABCD123"

nb_import = 0;

url_notes = (
    "http://"+ip+":"+port+"/notes?"
    "token="+token
)
url_folders = (
    "http://"+ip+":"+port+"/folders?"
    "token="+token
)
url_tags = (
    "http://"+ip+":"+port+"/tags?"
    "token="+token
)
url_ressources = (
    "http://"+ip+":"+port+"/ressources?"
    "token="+token
)

#Init
Diaro_UID = "12345678901234567801234567890123"
Lat = {}
Lng = {}
UID = {} 
TAGS = {}
Lat[""] = ""
Lng[""] = ""

payload = {
    "id": Diaro_UID,
    "title": "Diaro Import"
}

try:
    resp = requests.post(url_folders, json=payload)
    #time.sleep(1)
    resp.raise_for_status()
    resp_dict = resp.json()
    print(resp_dict)
    print("My ID")
    print(resp_dict['id'])
    Diaro_UID_real = resp_dict['id']
    save = str(resp_dict['id'])
    UID[Diaro_UID]= save
except requests.exceptions.HTTPError as e:
    print("Bad HTTP status code:", e)
except requests.exceptions.RequestException as e:
    print("Network error:", e)

print("Start : Parse Table")
tree = etree.parse("./DiaroBackup.xml")
for table in tree.iter('table'):
    name = table.attrib.get('name')
    print(name)
    myorder = 1
    for r in table.iter('r'):
         myuid = ""
         mytitle = ""
         mylat = ""
         mylng = ""
         mytags = ""
         mydate = ""
         mydate_ms = 0;
         mytext = ""
         myfilename = ""
         myfolder_uid = Diaro_UID
         mylocation_uid = ""
         myprimary_photo_uid = ""
         myentry_uid = ""
         myorder += 1
         nb_import += 1
         for subelem in r:
             print(subelem.tag)
             if (subelem.tag == 'uid'):
                 myuid = subelem.text
                 print ("myuid",myuid)
             if (subelem.tag == 'entry_uid'):
                 myentry_uid = subelem.text
                 print ("myentry_uid",myentry_uid)
             if (subelem.tag == 'primary_photo_uid'):
                 myprimary_photo_uid = subelem.text
                 print ("myprimary_photo_uid",myprimary_photo_uid)
             if (subelem.tag == 'folder_uid'):
                 myfolder_uid = subelem.text
                 print ("myfolder_uid",myfolder_uid)
             if (subelem.tag == 'location_uid'):
                 mylocation_uid = subelem.text
                 print ("mylocation_uid",mylocation_uid)
             if (subelem.tag == 'date'):
                 mydate = subelem.text
                 mydate_ms = int(mydate)
                 print ("mydate",mydate," in ms",mydate_ms)
             if (subelem.tag == 'title'):
                 mytitle = subelem.text
                 print ("mytitle",mytitle)
             if (subelem.tag == 'lat'):
                 mylat = subelem.text
                 print ("mylat",mylat)
             if (subelem.tag == 'lng'):
                 mylng = subelem.text
                 print ("mylng",mylng)
             if (subelem.tag == 'tags'):
                 mytags = subelem.text
                 if mytags:
                    mytags = mytags[1:]  # drop the leading separator character
                 print ("mytags",mytags)
             if (subelem.tag == 'text'):
                 mytext = subelem.text
                 print ("mytext",mytext)
                 #if type(mytext) == str:
                       #mytext = mytext.encode('utf8')
             if (subelem.tag == 'filename'):
                 myfilename = subelem.text
                 print ("myfilename",myfilename)
                 
         if (name == 'diaro_folders'):
            payload_folder = {
                "id": myuid,
                "title": mytitle,
                "parent_id": Diaro_UID_real
            }
            print(payload_folder)
            try:
                resp = requests.post(url_folders, json=payload_folder)
                resp.raise_for_status()
                resp_dict = resp.json()
                print(resp_dict)
                print(resp_dict['id'])
                save = str(resp_dict['id']) 
                UID[myuid]= save
            except requests.exceptions.HTTPError as e:
                print("Bad HTTP status code:", e)
            except requests.exceptions.RequestException as e:
                print("Network error:", e)

         if (name == 'diaro_tags'):
            payload_tags = {
                "id": myuid,
                "title": mytitle
            }
            try:
                resp = requests.post(url_tags, json=payload_tags)
                #time.sleep(1)
                resp.raise_for_status()
                resp_dict = resp.json()
                print(resp_dict)
                print(resp_dict['id'])
                UID[myuid]= resp_dict['id']
                TAGS[myuid] = mytitle
            except requests.exceptions.HTTPError as e:
                print("Bad HTTP status code:", e)
            except requests.exceptions.RequestException as e:
                print("Network error:", e)

         if (name == 'diaro_attachments'):
            print("Push : "+myfilename)
            filename = "media/photo/" + myfilename
            print("----------0-----------")

            cmd = "curl -F 'data=@"+filename+"' -F 'props={\"title\":\""+myfilename+"\"}' http://"+ip+":"+port+"/resources?token="+token
            resp = os.popen(cmd).read()
            respj = json.loads(resp)
            print(respj['id'])
            UID[myuid]= respj['id']

            print("Link : ",myuid," => ",myentry_uid," // ",UID[myuid]+" => ",UID[myentry_uid])
            time.sleep(1)

            # Not possible : sniff !
            #cmd = "curl -X PUT http://"+ip+":"+port+"/ressources/"+UID[myuid]+"/notes/"+UID[myentry_uid]+"?token="+token
            #resp = os.popen(cmd).read()
            #print (resp)

            url_link = (
               "http://"+ip+":"+port+"/notes/"+UID[myentry_uid]+"?"
               "token="+token
               )
            try:
               resp = requests.get(url_link)
               resp.raise_for_status()
               resp_dict = resp.json()
               print(resp_dict)
               mybody= resp_dict['body']
            except requests.exceptions.HTTPError as e:
               print("Bad HTTP status code:", e)
            except requests.exceptions.RequestException as e:
               print("Network error:", e)

            mybody = mybody + "\n  ![" + myfilename + "](:/" + UID[myuid] + ")   \n";
            payload_note = {
                "body": mybody
            }
            try:
               resp = requests.put(url_link, json=payload_note)
               resp.raise_for_status()
               resp_dict = resp.json()
               print(resp_dict)
            except requests.exceptions.HTTPError as e:
               print("Bad HTTP status code:", e)
            except requests.exceptions.RequestException as e:
               print("Network error:", e)

         if (name == 'diaro_locations'):
              Lat[myuid] = mylat
              Lng[myuid] = mylng

         if (name == 'diaro_entries'):
            if not mytext:
                  mytext = ""
            if not myfolder_uid:
                  myfolder_uid = Diaro_UID
            if not mytags:
                  mytags = ""
            if not mylocation_uid:
                  mylocation_uid = ""
            mytext = mytext.replace("'", "")
            mytitle = mytitle.replace("'", "")
            mytext = mytext.strip("\'")
            mytitle = mytitle.strip("\'")
            mytext = mytext.strip('(')
            mytitle = mytitle.strip('(')
            listtags = mytags.split(",")
            new_tagslist = "";
            for uid_tags in listtags:
                 if (len(uid_tags) > 2):
                        if uid_tags in TAGS:
                             new_tagslist = new_tagslist + TAGS[uid_tags] + ",";
            print ("TAGS",mytags,"==>",new_tagslist);
            payload_note = {
                "id": myuid,
                "latitude": Lat[mylocation_uid],
                "longitude": Lng[mylocation_uid],
                "tags": new_tagslist,
                "parent_id": UID[myfolder_uid],
                "title": mytitle,
                "source": myuid,
                "order": myorder,
                #"created_time": mydate_ms,
                "user_created_time": mydate_ms,
                "user_updated_time": mydate_ms,
                "author": "Diaro",
                "body": mytext 
            }
            try:
                resp = requests.post(url_notes, json=payload_note)
                #time.sleep(1)
                resp.raise_for_status()
                resp_dict = resp.json()
                print(resp_dict)
                print(resp_dict['id'])
                UID[myuid]= resp_dict['id']
            except requests.exceptions.HTTPError as e:
                print("Bad HTTP status code:", e)
            except requests.exceptions.RequestException as e:
                print("Network error:", e)

print("End : Parse Table")

print(strftime("%Y-%m-%d %H:%M:%S", gmtime()))  # log end time
done = time.time()
elapsed = done - start
print(elapsed)
print(nb_import)

# END : Ouf ...

Diaro ( diaroapp.com ) : Import from Awesome Note ( bridworks.com )

In my opinion, Diaro should put more development effort into data imports. Currently, people who switch from iPhone to Android are looking for an equivalent of Awesome Note, and the problem is that nobody imports Awesome Note backup data. In fact, before 22/05/2017 it was impossible because the data was encrypted; now it's entirely possible. Each note is a plist file in binary format.

Since 22/05/2017, it's easier to get access to the Awesome Note data:

Step n°1 : Export the data to a backup file (for example: aNote_13Folders_20170520_00_24_21_579Notes.anb.zip).

Step n°2 : Uncompress the file (for example: aNote_13Folders_20170520_00_24_21_579Notes.anb).

Step n°3 : Convert every *.anote file in the folder to XML. These files are plists (despite the *.anote extension); the command to convert one file:

plutil -convert xml1 some_file.anote
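
This step can also be scripted; a minimal sketch in Python 3, assuming the *.anote files sit in the current folder (plistlib reads the binary format directly and rewrites it as XML, like plutil -convert xml1):

import glob
import plistlib

# Convert every binary .anote plist in the folder to XML in place
for path in glob.glob("*.anote"):
    with open(path, "rb") as f:
        data = plistlib.load(f)  # accepts binary or XML plists
    with open(path, "wb") as f:
        plistlib.dump(data, f, fmt=plistlib.FMT_XML)
    print("converted", path)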

It's also possible to convert back to binary:

plutil -convert binary1 some_other_file.anote

Step n°4 : The task still to do… import the data… here is what a *.anote file looks like:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
 <key>$archiver</key>
 <string>NSKeyedArchiver</string>
 <key>$objects</key>
 <array>
 <string>$null</string>
 <dict>
 ...
</dict>
 <integer>?</integer>
 <false/>
 <real>?</real>
 <real>?</real>
 <integer>?</integer>
 <string>list of filename</string>
 <string>BASIC_BACKGROUND0004</string>
 <integer>23 ?</integer>
 <string>MarkerFelt-Thin ? </string>
 <string>Text </string>
 <string>Title</string> 
 ....
 <data>
 file 
</data>
 <dict>
...
 </dict>
 <key>$version</key>
 <integer>100000</integer>
</dict>
</plist>

So the goal is to import the date, the text, the title and the picture (file).
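
These files are NSKeyedArchiver archives, so the interesting values end up flattened into the $objects array; a minimal exploration sketch in Python 3 (it only dumps the candidate values, mapping them back to title/text/date/photo still has to be done by hand):

import plistlib

with open("some_file.anote", "rb") as f:
    note = plistlib.load(f)

# In an NSKeyedArchiver plist the real values live in the $objects array
for i, obj in enumerate(note["$objects"]):
    if isinstance(obj, str) and obj != "$null":
        print(i, "string:", obj[:60])         # candidate title / text / filename
    elif isinstance(obj, float):
        print(i, "real:", obj)                # candidate date (Cocoa timestamp)
    elif isinstance(obj, bytes):
        print(i, "data:", len(obj), "bytes")  # candidate embedded picture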

Diaro's current import options:

https://diaro.uservoice.com/knowledgebase/articles/420048-how-to-import-data-from-a-backup-file-evernote : How to import data from a backup file / Evernote / DayOne?

My previous posts on Diaro and Awesome Note:

Diaro : https://www.diaroapp.com ( Twitter : https://twitter.com/DiaroApp : @DiaroApp )

Awesome Note : 

Question to Diaro :