CPU: How to measure usage from a crontab script?

I cannot figure out how to take this measurement from a script run by crontab… The values I get are different from the CPU History graph on Ubuntu.
Currently I am using:

 "grep 'cpu ' /proc/stat | awk '{usage=($2+$4)*100/($2+$4+$5)" 

but I have also tried extracting it from the output of top:

top -bn1 | grep "Cpu(s)" | \
           sed "s/.*, *\([0-9.]*\)%* id.*/\1/" | \
           awk '{print 100 - $1"%"}'

UPDATE: I finally found the right command to get the CPU usage:

CPU_USAGE=$(awk '{u=$2+$4; t=$2+$4+$5; if (NR==1){u1=u; t1=t;} else print ($2+$4-u1) * 100 / (t-t1) "%"; }' <(grep 'cpu ' /proc/stat) <(sleep 1;grep 'cpu ' /proc/stat))
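
For readability, here is a minimal sketch of what that one-liner computes, assuming the standard /proc/stat layout (cpu user nice system idle iowait irq softirq …): busy time is user + system, total is busy + idle, sampled twice one second apart. Fields such as nice and iowait are ignored, which probably explains part of the gap with CPU History.

#!/bin/bash
# Sketch: sample the aggregate "cpu" line of /proc/stat twice, one second apart.
# busy = user + system ; total = busy + idle (nice, iowait, irq, ... are ignored)
read -r cpu user1 nice1 system1 idle1 rest < /proc/stat
sleep 1
read -r cpu user2 nice2 system2 idle2 rest < /proc/stat
busy=$(( (user2 + system2) - (user1 + system1) ))
total=$(( busy + (idle2 - idle1) ))
echo "$(( 100 * busy / total ))%"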

I am also noting this command:

$ mpstat 2 1 | awk '$12 ~ /[0-9.]+/ { print 100 - $12"%" }' | head -1
44%

Here is the CPU History graph:
Here is the Grafana graph (with the wrong command):
Michel, any idea?

Tuxedo Computer: a bash script to understand why the fan is so noisy

So here is the script I wrote:

#!/bin/bash

#
# MariaDB [mysql]> create database CPU;
# Query OK, 1 row affected (0.00 sec)

# MariaDB [(none)]> CREATE USER 'arias'@'localhost' IDENTIFIED BY 'arias';
# Query OK, 0 rows affected (0.00 sec)

# MariaDB [(none)]> GRANT USAGE ON *.* TO 'arias'@'localhost' IDENTIFIED BY 'arias';
# Query OK, 0 rows affected (0.00 sec)

# MariaDB [(none)]>  GRANT ALL privileges ON CPU.* TO 'arias'@'localhost';
# Query OK, 0 rows affected (0.00 sec)

# MariaDB [(none)]> FLUSH PRIVILEGES;
# Query OK, 0 rows affected (0.00 sec)
#
#
#mysql -u root -e "CREATE DATABASE CPU;"
#mysql -u root -e "USE CPU; CREATE TABLE information (date datetime, cpu float, fanId int, rawFanDuty int, fanDuty float, remoteTemp int, localTemp int);"

CPU_USAGE=$(awk '{u=$2+$4; t=$2+$4+$5; if (NR==1){u1=u; t1=t;} else print ($2+$4-u1) * 100 / (t-t1) "%"; }' <(grep 'cpu ' /proc/stat) <(sleep 1;grep 'cpu ' /proc/stat))
DATE=$(date "+%Y-%m-%d %H:%M:%S")
CPU_USAGE_2=$(echo $CPU_USAGE | sed 's/%//g' )
/usr/bin/tuxedofancontrol --show > /tmp/tuxedofancontrol.out
FANID=$(cat /tmp/tuxedofancontrol.out | grep "fanId" | awk '{print $2}' | sed 's/,//g') 
RAWFANDUTY=$(cat /tmp/tuxedofancontrol.out | grep "rawFanDuty" | awk  '{print $2}' | sed 's/,//g' )
FANDUTY=$(cat /tmp/tuxedofancontrol.out | grep "fanDuty" | awk '{print $2}' | sed 's/,//g')
REMOTETEMP=$(cat /tmp/tuxedofancontrol.out | grep "remoteTemp:" | awk '{print $2}' | sed 's/,//g')
LOCALTEMP=$(cat /tmp/tuxedofancontrol.out | grep "localTemp:" | awk '{print $2}' | sed 's/,//g' )

if [ -n "$FANID" ]
then
        SQL="USE CPU; INSERT INTO information (date, cpu, fanId, rawFanDuty, fanDuty, remoteTemp, localTemp) VALUES ('$DATE',$CPU_USAGE_2,$FANID,$RAWFANDUTY,$FANDUTY,$REMOTETEMP,$LOCALTEMP);"
else
        SQL="USE CPU; INSERT INTO information (date, cpu) VALUES ('$DATE',$CPU_USAGE_2);"
fi

echo $SQL > /tmp/lastsql.out
mysql -u root -e "$SQL"

It is fairly simple: I record the date and the CPU usage, plus all the information reported by tuxedofancontrol.
I run the script every 2 minutes from my crontab; next I will build a Grafana dashboard on top of this data.

$ sudo crontab -l
*/2 * * * * /home/arias/Bash/cpu.bash
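
To check that rows are actually landing in the table before building the Grafana dashboard, a quick query can be run from the shell (a sketch, assuming the CPU.information table created in the comments of the script above):

mysql -u root -e "SELECT date, cpu, fanDuty, remoteTemp FROM CPU.information ORDER BY date DESC LIMIT 5;"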

For example:

To be continued.

Apparently I am not the only one who finds the fan a bit too noisy: https://www.tuxedocomputers.com/en/Infos/Help-Support/Frequently-asked-questions/Why-is-the-ventilator-so-loud-.tuxedo.

Why is the ventilator so loud?

IMPORTANT: There are ventilation slots on the bottom side through which the ventilator(s) suck in air. Never place your notebook on a pillow, couch, bed, blanket or tablecloth. In the worst case scenario, the notebook can overheat and cause serious damage.

The ventilator noise depends largely on the system load. If the processor has to work a lot, it consumes a lot of power, which leads to increased waste heat that somehow has to be cooled. If the ventilation slots are covered, the ventilator(s) cannot suck in any air. This has a negative effect on the cooling performance, the ventilator(s) must rotate faster to provide sufficient cooling and the noise level is significantly higher.

Finding duplicate images quickly

Before using Digikam, it is easier to delete identical images with a simple script:

find Images/  -type f -exec md5sum '{}' ';' | sort | uniq --all-repeated=separate -w 15 > dupes.txt

awk '/^$/{getline;print;}' dupes.txt | awk '{print $2 " " $3 " " $4}' | xargs gvfs-trash

It is also possible to use:

fdupes -rSm Images

The -d option enables deletion.
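
For a non-interactive clean-up, -d can be combined with -N so that the first file of each duplicate set is kept and the others are deleted without prompting (a sketch; best tried on a test directory first):

# recursive, delete duplicates, keep the first file of each set, no prompt
fdupes -rdN Images/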

Digikam can then be used; it is able to detect identical images even when they do not have the same file size.

Thunderbird mbox to (Influxdb, Postgresql, mysql) to Grafana in Python

I have improved the program (see: http://www.cyber-neurones.org/2020/03/thunderbird-mbox-to-influxdb-and-postgresql-to-grafana-in-python/) so that it also injects the data into MySQL (MariaDB, actually). MariaDB is the easiest backend to work with in Grafana.

To connect Grafana to MariaDB:
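
The MySQL data source in Grafana essentially needs the host, the database and the user created for the import. A quick way to verify those credentials from the shell (a sketch, assuming the arias/arias account and the thunderbird database set up in the comments of the script below):

mysql -u arias -parias -h 127.0.0.1 thunderbird -e "SELECT COUNT(*) FROM thunderbird;"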

The SQL queries for Grafana:

Per day:

SELECT
UNIX_TIMESTAMP(date) AS time_sec,
domain as 'metric',
count(domain) as value
FROM thunderbird
WHERE
$__timeFilter(date)
GROUP BY DAY(date),MONTH(date),YEAR(date)
ORDER BY date

Per month:

SELECT
UNIX_TIMESTAMP(date) AS time_sec,
domain as 'metric',
count(domain) as value
FROM thunderbird
WHERE
$__timeFilter(date)
GROUP BY MONTH(date),YEAR(date)
ORDER BY date

Per year:

SELECT
UNIX_TIMESTAMP(date) AS time_sec,
domain as 'metric',
count(domain) as value
FROM thunderbird
WHERE
$__timeFilter(date)
GROUP BY YEAR(date)
ORDER BY date

The program source:

(Source at: https://github.com/farias06/Python/blob/master/parse_email_v2.py)

#! /usr/bin/env python3
# ~*~ utf-8 ~*~

# Readme :
# ARIAS FREDERIC 
# 
# influx user create -n arias -p arias -o cyberneurones-org 
#

import mailbox
import bs4
import glob
import os
import time
import codecs
import sys

#
#
from influxdb import InfluxDBClient
import re
from datetime import datetime

#
#
from email.utils import parsedate_to_datetime

#
#
import logging

#
#  pip3 install psycopg2
import psycopg2

#  pip3 install mysql-connector-python-rf
#  pip3 install mysql

# MariaDB [mysql]> create database thunderbird;
# Query OK, 1 row affected (0.00 sec)

# MariaDB [(none)]> CREATE USER 'arias'@'localhost' IDENTIFIED BY 'arias';
# Query OK, 0 rows affected (0.00 sec)

# MariaDB [(none)]> GRANT USAGE ON *.* TO 'arias'@'localhost' IDENTIFIED BY 'arias';
# Query OK, 0 rows affected (0.00 sec)

# MariaDB [(none)]>  GRANT ALL privileges ON thunderbird.* TO 'arias'@'localhost';
# Query OK, 0 rows affected (0.00 sec)

# MariaDB [(none)]> FLUSH PRIVILEGES;
# Query OK, 0 rows affected (0.00 sec)

import mysql.connector as mariadb

#########################

logger = logging.Logger('catch_all')

#########################

global nb_folder
nb_folder = 0;
global nb_email
nb_email = 0;
global nb_error
nb_error = 0;
global id_email
id_email = 0;

global flag_influxdb
flag_influxdb = False

global flag_postgresql
flag_postgresql = False

global flag_mysql
flag_mysql = True

global name_Table
name_DB = 'thunderbird'
name_Table = 'thunderbird'
my_login = 'arias'
my_password = 'arias'
my_host = '127.0.0.1'
Login = 'arias';
Folder = 'zy3zk9ms.default';

global client
if (flag_influxdb == True):
   client = InfluxDBClient(host=my_host, port=8086, username=my_login, password=my_password)
   client.drop_database(name_Table)
   client.create_database(name_Table)
   client.switch_database(name_Table)

global client2
if (flag_postgresql == True):
   client2 = psycopg2.connect("dbname="+name_DB+" user="+my_login+" password='"+my_password+"'")
   cursor2 = client2.cursor()
   sqlCreateTable = "create table "+name_Table+" (id bigint UNIQUE, mail varchar(128), name varchar(128), domain varchar (128), date timestamp);"
   cursor2.execute(sqlCreateTable)
   sqlCreateTable = "delete from "+name_Table;
   cursor2.execute(sqlCreateTable)
   client2.commit()

global client3
if (flag_mysql == True):
   client3 = mariadb.connect(user=my_login, password=my_password, database=name_DB)
   cursor3 = client3.cursor()
   #sqlCreateTable = "create table "+name_Table+" (id bigint UNIQUE, mail varchar(128), name varchar(128), domain varchar (128), date datetime);"
   #cursor3.execute(sqlCreateTable)
   sqlCreateTable = "delete from "+name_Table;
   cursor3.execute(sqlCreateTable)
   client3.commit()

#########################

def get_html_text(html):
    try:
        return bs4.BeautifulSoup(html, 'lxml').body.get_text(' ', strip=True)
    except AttributeError: # message contents empty
        return None

class GmailMboxMessage():
    def __init__(self, email_data):
        if not isinstance(email_data, mailbox.mboxMessage):
            raise TypeError('Variable must be type mailbox.mboxMessage')
        self.email_data = email_data

    def parse_email(self):
        global client
        global client2
        global id_email
        global name_Table
        global cursor2
        global cursor3
        global flag_influxdb
        global flag_postgresql
        global flag_mysql
        email_date = self.email_data['Date']
        email_from = self.email_data['From']
        email_to = self.email_data['To']
        email_subject = self.email_data['Subject']
        if email_date is not None and email_from is not None:
            mail = re.search(r'[\w\.\-_]+@[\w\.\-_]+', email_from)
            if mail is not None:
                mailstr = mail.group(0)
            if mail is not None:    
                domain = re.search("@[\w\.\-_]+", email_from).group(0)
                domain = domain.replace('@', '')
                domain = domain.replace('>', '')
            if mail is not None:
                user = re.search("[\w\.i\-_]+@", email_from).group(0)
                user = user.replace('@', '')
                user = user.replace('<', '')
            local_time_str = datetime.fromtimestamp(parsedate_to_datetime(email_date).timestamp()).strftime('%Y-%m-%dT%H:%M:%S.%f%z')
            local_time_str2 = datetime.fromtimestamp(parsedate_to_datetime(email_date).timestamp()).strftime('%Y-%m-%d %H:%M:%S')
            timestamp = round(parsedate_to_datetime(email_date).timestamp() * 1000);
            if mail is not None:
                data = [{'measurement': 'thunderbirds', 'tags': { 'fullemail': 1, 'from': email_from, 'mail': mailstr, 'domain': domain, 'user': user}, 'id' : id_email, 'time': timestamp, 'date':local_time_str, 'fields': {"value": 1}}]
                sql = "INSERT INTO "+name_Table+" (mail, domain, name, id, date) VALUES ('" +mailstr+ "','" + domain+"','" +user+"','"+str(id_email)+"',TIMESTAMP '"+local_time_str2+"')";
                sql2 = "INSERT INTO "+name_Table+" (mail, domain, name, id, date) VALUES ('" +mailstr+ "','" + domain+"','" +user+"','"+str(id_email)+"','"+local_time_str2+"')";
                #print (sql2)
                if (flag_postgresql == True):
                   cursor2.execute(sql);
                   client2.commit();
                if (flag_mysql == True):
                   cursor3.execute(sql2);
                   client3.commit();
            else :
               data = [{'measurement': 'thunderbirds', 'tags': { 'fullemail': 0, 'from': email_from }, 'id' : id_email, 'time': timestamp, 'date':local_time_str, 'fields': {"value": 1}}] 
            #print (data);
            if (flag_influxdb == True):
               client.write_points(data, time_precision='ms')
            id_email = id_email+1

def mbox_reader(stream):
    data = stream.read()
    text = data.decode(encoding="utf-8")
    return mailbox.mboxMessage(text)

######################### End of library, example of use below

print("\nUsing glob.iglob()") 
for filename in glob.iglob('/home/'+Login+'/snap/thunderbird/common/.thunderbird/'+Folder+'/Mail/Local Folders/**/*', recursive=True): 
    print(filename);
    filename2, file_extension = os.path.splitext(filename);
    print(file_extension + " " + str(len(file_extension)));
    isFile = os.path.isfile(filename)
    if (file_extension != ".msf") and (file_extension != ".sbd") and isFile is True:
        mbox_obj = mailbox.mbox(filename);
        num_entries = len(mbox_obj)
        nb_folder = nb_folder + 1;
        try :
           for idx, email_obj in enumerate(mbox_obj):
               email_data = GmailMboxMessage(email_obj)
               email_data.parse_email()
               nb_email = nb_email + 1;
               print('Parsing email {0} of {1}'.format(idx, num_entries))
        except StopIteration:
           continue
        except Exception as e: 
           logger.error('Failed : '+ str(e))
           nb_error = nb_error+1;
           continue

print('The number of folder :'+str(nb_folder));    
print('The number of email :'+str(nb_email));
print('The number of error : '+str(nb_error));
print('The number in database : '+str(id_email));
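
To run the import, the dependencies hinted at in the imports and in the pip3 comments have to be installed first (a sketch; exact package names may differ depending on the distribution):

pip3 install influxdb psycopg2 mysql-connector-python-rf bs4 lxml
python3 parse_email_v2.py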

Thunderbird mbox to Influxdb and Postgresql to Grafana in Python

So I wrote a Python program to export Thunderbird data (the mbox files containing the emails) to Influxdb and Postgresql (as a first step).

The goal of this program is to get comfortable with Python before exporting data to Grafana. In my examples the login/password is arias/arias (I know, it is not secure, but it is just an example…).

Grafana configuration:

The program:

(Source available here: https://github.com/farias06/Python/blob/master/parse_email.py)

#! /usr/bin/env python3
# ~*~ utf-8 ~*~

# Readme :
# influx user create -n arias -p arias -o cyberneurones-org 

import mailbox
import bs4
import glob
import os
import time
import codecs
import sys
from influxdb import InfluxDBClient
import re
from datetime import datetime
from email.utils import parsedate_to_datetime
import logging
import psycopg2

logger = logging.Logger('catch_all')

#########################

global nb_folder
nb_folder = 0;
global nb_email
nb_email = 0;
global nb_error
nb_error = 0;
global id_email
id_email = 0;

global client
client = InfluxDBClient(host='127.0.0.1', port=8086, username='arias', password='arias')
client.drop_database('thunderbird')
client.create_database('thunderbird')
client.switch_database('thunderbird')

global client2
client2   = psycopg2.connect("dbname=thunderbird user=arias password='arias'")
cursor                = client2.cursor()
global name_Table
name_Table            = "thunderbird"
#sqlCreateTable = "create table "+name_Table+" (id bigint UNIQUE, mail varchar(128), name varchar(128), domain varchar (128), date timestamp);"
sqlCreateTable = "delete from "+name_Table;
cursor.execute(sqlCreateTable)
client2.commit()

#########################

def get_html_text(html):
    try:
        return bs4.BeautifulSoup(html, 'lxml').body.get_text(' ', strip=True)
    except AttributeError: # message contents empty
        return None

class GmailMboxMessage():
    def __init__(self, email_data):
        if not isinstance(email_data, mailbox.mboxMessage):
            raise TypeError('Variable must be type mailbox.mboxMessage')
        self.email_data = email_data

    def parse_email(self):
        global client
        global client2
        global id_email
        global name_Table
        global cursor
        email_date = self.email_data['Date']
        email_from = self.email_data['From']
        email_to = self.email_data['To']
        email_subject = self.email_data['Subject']
        if email_date is not None and email_from is not None:
            mail = re.search(r'[\w\.\-_]+@[\w\.\-_]+', email_from)
            if mail is not None:
                mailstr = mail.group(0)
            if mail is not None:    
                domain = re.search("@[\w\.\-_]+", email_from).group(0)
                domain = domain.replace('@', '')
                domain = domain.replace('>', '')
            if mail is not None:
                user = re.search("[\w\.i\-_]+@", email_from).group(0)
                user = user.replace('@', '')
                user = user.replace('<', '')
            local_time_str = datetime.fromtimestamp(parsedate_to_datetime(email_date).timestamp()).strftime('%Y-%m-%dT%H:%M:%S.%f%z')
            local_time_str2 = datetime.fromtimestamp(parsedate_to_datetime(email_date).timestamp()).strftime('%Y-%m-%d %H:%M:%S')
            timestamp = round(parsedate_to_datetime(email_date).timestamp() * 1000);
            if mail is not None:
                data = [{'measurement': 'thunderbirds', 'tags': { 'fullemail': 1, 'from': email_from, 'mail': mailstr, 'domain': domain, 'user': user}, 'id' : id_email, 'time': timestamp, 'date':local_time_str, 'fields': {"value": 1}}]
                sql = "INSERT INTO "+name_Table+" (mail, domain, name, id, date) VALUES ('" +mailstr+ "','" + domain+"','" +user+"','"+str(id_email)+"',TIMESTAMP '"+local_time_str2+"')";
                #print (sql)
                cursor.execute(sql);
                client2.commit();
            else :
               data = [{'measurement': 'thunderbirds', 'tags': { 'fullemail': 0, 'from': email_from }, 'id' : id_email, 'time': timestamp, 'date':local_time_str, 'fields': {"value": 1}}] 
            #print (data);
            client.write_points(data, time_precision='ms')
            id_email = id_email+1

def mbox_reader(stream):
    data = stream.read()
    text = data.decode(encoding="utf-8")
    return mailbox.mboxMessage(text)

######################### End of library, example of use below

print("\nUsing glob.iglob()") 
for filename in glob.iglob('/home/ZZZZZZZ/snap/thunderbird/common/.thunderbird/ZZZZZZZZZ.default/Mail/Local Folders/**/*', recursive=True): 
    print(filename);
    filename2, file_extension = os.path.splitext(filename);
    print(file_extension + " " + str(len(file_extension)));
    isFile = os.path.isfile(filename)
    if (file_extension != ".msf") and (file_extension != ".sbd") and isFile is True:
        mbox_obj = mailbox.mbox(filename);
        num_entries = len(mbox_obj)
        nb_folder = nb_folder + 1;
        try :
           for idx, email_obj in enumerate(mbox_obj):
               email_data = GmailMboxMessage(email_obj)
               email_data.parse_email()
               nb_email = nb_email + 1;
               print('Parsing email {0} of {1}'.format(idx, num_entries))
        except StopIteration:
           continue
        except Exception as e: 
           logger.error('Failed : '+ str(e))
           nb_error = nb_error+1;
           continue

print('The number of folder :'+str(nb_folder));    
print('The number of email :'+str(nb_email));
print('The number of error : '+str(nb_error));
print('The number in database : '+str(id_email));

Running the program:

...
The number of folder :1321
The number of email :120384
The number of error : 125
The number in database : 113523

Yes, I have more than 113,523 emails (about 30 GB)… what a pain. For the record, only open source software stays stable with that many emails; Outlook simply crashes.

The result: