Global warming and pollution in general

2020.01.10 – CO2 concentration, 285 ppbv in Minerbe (Italy), not great not terrible

Ecology is close to my heart; the danger we are facing is real. Raising mass awareness of this danger is fundamental, and fighting for a solution is, at the very least, a duty. It is only fair, though, to point out a few things.

It's easy to take to the streets and shout about what we don't like: no nuclear power, no drilling, no global warming, no fusion experiments, no GMOs, no plants that ruin the landscape, no particulate matter, no traffic, no TAV, no highways, no electric cars because the batteries pollute and the energy to produce them generated pollution (and diesel or petrol cars, what do they do?), no photovoltaic panels because in 20 years they will need to be disposed of; long live nature! Let the powerful solve global warming, for God's sake, there is only one planet and this is no joke.

Collective appeals are easy; but when it comes to giving something up ourselves, to reasoning from the individual's perspective, all the chickens come home to roost.

Our life is made of energy, which is not created by divine grace; I am not talking about spirituality, but about raw balances in Joules and a handful (more or less) of physical formulas that hold the universe together.

The food we bring to the table costs resources, and yes, vegan food too; after all, it doesn't get to the supermarket on its own. Our travel costs resources (and time! Never forget to count time as a resource). The electronic devices we depend on cost resources: think of how you clog the servers with the memes of a bewildered-looking Messi you send around, or when you browse influencers' profiles; it is always energy being consumed. Our always-warm, or always-cool, or always-lit home demands resources continuously. This expenditure, this well-being and economic growth, has damaged our planet and keeps doing so every single second, day after day, month after month, year after year.

Do you hope to satisfy these needs overnight with clean energy alone, quickly enough to avert disaster? It will not happen. It's brutal, but that's how it is. It is not technically possible; it is utopian. Our environmental crisis will be solved only with compromises from everyone; complying with the Paris agreements will not be enough. Don't delude yourselves, there are no magic wands; you can protest as much as you want, but if you really believe in what you are doing, get ready to change.

Believe me: when we democratically decided to give up nuclear power, none of us would have given up the lights at home. (I consider the choice to give up nuclear power the child of collective madness and of our leaders' inability to take an important decision without looking at mere electoral consensus.)

When you claim you want to stop global warming, do you think you can, at the same time, give up your meat, your exotic fruit, your four cars per family, the heating always on, and the rest of your comforts?

Either we produce energy, or we will have no growth and well-being and will go back to the Stone Age. Either we consume resources in proportion to our clean-energy production capacity, or we will pollute. Either we stop polluting (or at least drastically curb the process), or we will compromise the planet.

You decide where to place the compromise in this loop.

(Post rearranged by yours truly from a post found on the Internet whose source I can't remember, but whom I thank infinitely for the wisdom and simplicity with which these concepts were expressed [if you're reading this, step forward!]. Btw, no, it's not by Greta Thunberg.)


Import SQL file using pgAdmin

I have a Docker Postgres image and I want to import the data from another Postgres database. The first thing I did was create a pg_dump on the remote server; then I tried to import it. The problem is that the generated output is a plain SQL file and, if I import this file with pgAdmin, I get an error:

pg_restore: [archiver] input file appears to be a text format dump. Please use psql.

psql is not installed on my Mac because I am running Postgres as a Docker image, and the exported file uses COPY to load values instead of INSERT.
The solution I found is to export the db using the --column-inserts flag:

$ pg_dump --column-inserts -U user db_test > db_test.2020-01-02_insert.sql

--column-inserts dumps the data as INSERT commands with column names.
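The generated file can then be imported with pgAdmin. Alternatively, since psql is available inside the container, the dump can be piped into it; a minimal sketch, assuming the container is named postgres-local (the container name here is hypothetical):

$ docker exec -i postgres-local psql -U user -d db_test < db_test.2020-01-02_insert.sql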


Install MySQL server with Docker Compose

Inside a new directory, create a data directory and a docker-compose.yml file with these rows:

version: '3'

services: 
  db:
    container_name: docker-local-mysql
    image: mysql:5.7.21
    volumes:
      - "./data:/var/lib/mysql"
    restart: always
    ports:
      - 3306:3306
    environment:
      MYSQL_ROOT_PASSWORD: password

To start the container, run docker-compose up -d. To stop & remove the container, run docker-compose down.

To connect from a MySQL client, use this:

Host: 127.0.0.1
Username: root
Password: password
Port: 3306
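
For example, from the mysql command-line client on the host (assuming it is installed):

$ mysql -h 127.0.0.1 -P 3306 -u root -p

The -p flag prompts for the password above.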

Warning: setlocale: LC_CTYPE: cannot change locale (UTF-8): No such file or directory

After a recent Ubuntu update, I get this error. What finally helped was adding these lines to the file /etc/environment:

LC_ALL=en_US.UTF-8
LANG=en_US.UTF-8

For some reason, they were missing. The output of locale and other commands looked as if the variables were properly defined. In other words, don't take for granted that all the basic stuff is declared where it should be declared.
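
After editing /etc/environment, log out and back in (or reboot) so the file is re-read, then check that the variables are really set; for example:

$ locale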


How to fix a locale setting warning from Perl and/or *nix bash?

Perl is sometimes called by *nix scripts and, after a recent update, I get this warning message (in particular when I run apt or npm commands):

perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
	LANGUAGE = (unset),
	LC_ALL = (unset),
	LC_CTYPE = "UTF-8",
	LANG = "en_US.UTF-8"
    are supported and installed on your system.
perl: warning: Falling back to a fallback locale ("en_US.UTF-8").
locale: Cannot set LC_CTYPE to default locale: No such file or directory
locale: Cannot set LC_ALL to default locale: No such file or directory

This can be fixed by simply adding the following lines to your bashrc or bash_profile on the host machine:

export LC_CTYPE=en_US.UTF-8
export LC_ALL=en_US.UTF-8
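
If the warning persists, the locale itself may not be generated on the machine; on Debian/Ubuntu systems it can usually be generated with (a sketch, assuming en_US.UTF-8 is the locale you need):

$ sudo locale-gen en_US.UTF-8
$ sudo dpkg-reconfigure locales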

Find on different collections to create a document using mongoose

I need to query multiple collections to prepare a MongoDB document and then save it using Mongoose and NodeJS. The solution is to use async.parallel.
I have two source collections, Robot and Target, and the destination collection Activity.

First, the async module must be required:

var async = require('async');

then:

async.parallel({

    robotFind: function(cb) { Robot.find({ "_id": jsonContent.robotId }).exec(cb); },
    targetFind: function(cb) { Target.find({ "_id": jsonContent.targetId }).exec(cb); }

}, function(err, result) {

    // Abort if any of the parallel queries failed.
    if (err) {
        console.log('[postActivity] ' + err)
        return res.status(500).json({ error: err.message })
    }

    var activity = new Activity();

    activity.robot = result.robotFind[0];
    activity.target = result.targetFind[0];

    activity.execution_date = jsonContent.execution_date;
    activity.alert = jsonContent.alert;
    activity.result = executionResult;

    activity.description = jsonContent.description;

    activity.save(function(err) {
        if (err) {
            console.log('[postActivity] ' + err)
            res.status(500).json({ error: err.message })
        } else {
            console.log('[postActivity] Saved!')
            res.status(200).json({ message: activity })
        }
    })
});

So, the first part queries MongoDB and fills the result object; the second part consumes the result object, creates the new document, and saves it.


How to use Spring DataSource bean as data source for Log4j 2 JDBC appender

I would like to log Log4j 2 messages into a relational database, using the DataSource defined in the application context and initialized by Spring, with Log4j 2.10.

One possibility is to add a JDBC appender to the log4j2 XML configuration, but Log4j is initialized before Spring, so the DataSource would not be available at that point; the only solution is to add the appender programmatically. Of course, it is possible to use the plain log4j2 JDBC appender, but this approach keeps the option of using the spring.properties file to override the Spring application context with the proper environment settings according to my profile.

This is the DataSource defined in the application context XML:

<!--  ############ SQLSERVER DATABASE SECTION ############ -->
<bean id="dataSourceMSSqlServer" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver" />
    <property name="url" value="jdbc:sqlserver://${sqlserver.hostname};databaseName=${sqlserver.database};" />
    <property name="username" value="${sqlserver.user}" />
    <property name="password" value="${sqlserver.pass}" />
</bean>

This allows me to configure a different database for every environment.
This is the table into which I want to log the entries:

[Id] [INT] IDENTITY(1,1) NOT NULL,
[CreatedTimeStamp] [datetimeoffset](7) NOT NULL,
[Level] [INT] NOT NULL,
[SOURCE] [nvarchar](MAX) NULL,
[Message] [nvarchar](MAX) NULL,
[Content] [nvarchar](MAX) NULL,
[ProductName] [nvarchar](MAX) NULL,
[Version] [nvarchar](MAX) NULL,
[LogType] [INT] NOT NULL DEFAULT ((0)),
[AuditEventType] [INT] NULL,
[UserId] [nvarchar](128) NULL,

The plan is to create a Spring bean, inject the DataSource bean, and add the JDBC appender configuration dynamically in a @PostConstruct method.

package com.afm.web.utility;
 
import java.sql.Connection;
import java.sql.SQLException;
 
import javax.annotation.PostConstruct;
import javax.sql.DataSource;
 
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.appender.db.ColumnMapping;
import org.apache.logging.log4j.core.appender.db.jdbc.ColumnConfig;
import org.apache.logging.log4j.core.appender.db.jdbc.ConnectionSource;
import org.apache.logging.log4j.core.appender.db.jdbc.JdbcAppender;
import org.apache.logging.log4j.core.config.AppenderRef;
import org.apache.logging.log4j.core.config.Configuration;
import org.apache.logging.log4j.core.config.LoggerConfig;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
 
@Component
public class JDBCLog {
 
    @Autowired
    private DataSource dataSourceMSSqlServer;
 
    // Inner class
    class Connect implements ConnectionSource {
 
	private DataSource dsource;
 
	public Connect(DataSource dsource) {
	    this.dsource = dsource;
	}
 
	@Override
	public Connection getConnection() throws SQLException {
	    return this.dsource.getConnection();
	}
 
    }
 
    public JDBCLog() {}
 
    @PostConstruct
    private void init(){
 
	System.out.println("####### JDBCLog init() ########");      
	final LoggerContext ctx = (LoggerContext) LogManager.getContext(false); 
	final Configuration config = ctx.getConfiguration();
 
	// Here I define the columns I want to log. 
	ColumnConfig[] columnConfigs = new ColumnConfig[] {
	    ColumnConfig.newBuilder()
                .setName("CreatedTimeStamp")
                .setPattern(null)
                .setLiteral(null)
                .setEventTimestamp(true)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("Source")
                .setPattern("%K{className}")
                .setLiteral(null)
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("Level")
                .setPattern("%level")
                .setLiteral(null)
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("Message")
                .setPattern("%K{message}")
                .setLiteral(null)
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("Content")
                .setPattern("%K{exception}")
                .setLiteral(null)
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("ProductName")
                .setPattern(null)
                .setLiteral("'DHC'")
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("Version")
                .setPattern(null)
                .setLiteral("'1.0'")
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("AuditEventType")
                .setPattern("%K{eventId}")
                .setLiteral(null)
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("UserId"
                .setPattern("%K{userId}")
                .setLiteral(null)
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build(),
	    ColumnConfig.newBuilder()
                .setName("LogType")
                .setPattern("%K{logType}")
                .setLiteral(null)
                .setEventTimestamp(false)
                .setUnicode(false)
                .setClob(false).build()
        };
 
	Appender jdbcAppender = JdbcAppender.newBuilder()
		.setBufferSize(0)
                .setColumnConfigs(columnConfigs)
                .setColumnMappings(new ColumnMapping[]{})
                .setConnectionSource(new Connect(dataSourceMSSqlServer))
                .setTableName("dhc.LogItems")
                .withName("databaseAppender")
                .withIgnoreExceptions(true)
                .withFilter(null)
                .build();
 
	jdbcAppender.start();
	config.addAppender(jdbcAppender);
 
	// Create an Appender reference.
	// @param ref The name of the Appender.
	// @param level The Level to filter against.
	// @param filter The filter(s) to use.
	// @return The name of the Appender.
	AppenderRef ref = AppenderRef.createAppenderRef("databaseAppender", null, null);
        AppenderRef[] refs = new AppenderRef[] {ref};
 
        /*
         * Factory method to create a LoggerConfig.
         *
         * @param additivity true if additive, false otherwise.
         * @param level The Level to be associated with the Logger.
         * @param loggerName The name of the Logger.
         * @param includeLocation whether location should be passed downstream
         * @param refs An array of Appender names.
         * @param properties Properties to pass to the Logger.
         * @param config The Configuration.
         * @param filter A Filter.
         * @return A new LoggerConfig.
         * @since 2.6
         */
        LoggerConfig loggerConfig = LoggerConfig.createLogger(
                false, Level.DEBUG, "JDBC_Logger", null, refs, null, config, null);        
        loggerConfig.addAppender(jdbcAppender, null, null);
 
        config.addLogger("JDBC_Logger", loggerConfig);       
        ctx.updateLoggers();  
 
        System.out.println("####### JDBCLog init() - DONE ########");  
 
    }
 
    public DataSource getDataSource() {
	return dataSourceMSSqlServer;
    }
 
    public void setDataSource(DataSource dataSourceMSSqlServer) {
	this.dataSourceMSSqlServer = dataSourceMSSqlServer;
    }	  
 
}

At this point, it is possible to call the logger from the code in this way:

Logger jdbcLogger = LogManager.getContext(false).getLogger("JDBC_Logger"); 
jdbcLogger.info(new StringMapMessage()
    .with("eventId", AuditEventType.Logger_General.toString())
    .with("exception", "")
    .with("userId", "TESTUSER")
    .with("message", "TEST!!")
    .with("className", 
        this.getClass().getPackage().toString().replaceAll("package ", "") 
        + "." + this.getClass().getSimpleName() 
        + "." + new Object() {}.getClass().getEnclosingMethod().getName())
);
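
For completeness, the snippet above assumes these imports (StringMapMessage carries the key/value pairs referenced by the %K{...} patterns in the column configuration):

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.message.StringMapMessage;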

If you are using Log4j in a Servlet 2.5 web application, or if you have disabled auto-initialization with the isLog4jAutoInitializationDisabled context parameter, you must configure the Log4jServletContextListener and Log4jServletFilter in the deployment descriptor or programmatically. The filter should match all requests of any type. The listener should be the very first listener defined in your application, and the filter should be the very first filter defined and mapped in your application. This is easily accomplished using the following web.xml code:

<listener>
    <listener-class>
        org.apache.logging.log4j.web.Log4jServletContextListener
    </listener-class>
</listener>
 
<filter>
    <filter-name>log4jServletFilter</filter-name>
    <filter-class>
        org.apache.logging.log4j.web.Log4jServletFilter
    </filter-class>
</filter>
<filter-mapping>
    <filter-name>log4jServletFilter</filter-name>
    <url-pattern>/*</url-pattern>
    <dispatcher>REQUEST</dispatcher>
    <dispatcher>FORWARD</dispatcher>
    <dispatcher>INCLUDE</dispatcher>
    <dispatcher>ERROR</dispatcher>
    <!-- 
        Servlet 3.0 w/ disabled auto-initialization only; 
        not supported in 2.5 
    -->
    <dispatcher>ASYNC</dispatcher>
</filter-mapping>

This dependency must be added to pom.xml:

<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-web</artifactId>
    <version>2.10.0</version>
</dependency>

Of course, if the package is not already included, add it to the component-scan in the Spring application context:

<context:component-scan base-package="com.afm.web.utility" />


Warning about SSL connection when connecting to MySQL database

After a recent update of MySQL, I get this warning:

WARN: Establishing SSL connection without server's identity verification is not recommended. 
According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established 
by default if explicit option isn't set. For compliance with existing applications not using SSL 
the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL 
by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.

To disable SSL and also suppress the SSL warning, it's possible to set the useSSL parameter to false in the connection string:

jdbc:mysql://localhost:3306/myDb?autoReconnect=true&useSSL=false

In the applicationContext, something like this:

<!--  ############ MY SQL DATABASE SECTION ############ -->
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">	
    <property name="driverClassName"><value>com.mysql.jdbc.Driver</value></property>
    <property name="url">
        <value>jdbc:mysql://${mysql.hostname}:${mysql.port}/${mysql.db}?useSSL=false</value>
    </property>
    <property name="username"><value>${mysql.user}</value></property>
    <property name="password"><value>${mysql.password}</value></property>
</bean>

Convert timestamp long to normal date format

One simple way to convert a Long timestamp into a formatted string is (the time parameter is the Long timestamp):

Date date = new Date(time);
Format format = new SimpleDateFormat("yyyy MM dd HH:mm:ss");
return format.format(date);

These classes must be imported (note: java.util.Date, not java.sql.Date, since we want a full date and time):

import java.util.Date;
import java.text.Format;
import java.text.SimpleDateFormat;
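
Wrapped into a minimal, self-contained example (class and method names are mine):

import java.util.Date;
import java.text.Format;
import java.text.SimpleDateFormat;

public class TimestampFormatter {

    // Converts a millisecond timestamp into a "yyyy MM dd HH:mm:ss" string.
    public static String format(long time) {
        Date date = new Date(time);
        Format format = new SimpleDateFormat("yyyy MM dd HH:mm:ss");
        return format.format(date);
    }

    public static void main(String[] args) {
        // Prints the current date and time, e.g. "2020 01 10 12:34:56"
        System.out.println(format(System.currentTimeMillis()));
    }
}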

Date and Time Patterns

Date and time formats are specified by date and time pattern strings. Within date and time pattern strings, unquoted letters from 'A' to 'Z' and from 'a' to 'z' are interpreted as pattern letters representing the components of a date or time string. Text can be quoted using single quotes (') to avoid interpretation. "''" represents a single quote.
All other characters are not interpreted; they’re simply copied into the output string during formatting or matched against the input string during parsing.

The following pattern letters are defined (all other characters from 'A' to 'Z' and from 'a' to 'z' are reserved):

Letter  Date or Time Component  Presentation       Examples
G       Era designator          Text               AD
y       Year                    Year               1996; 96
M       Month in year           Month              July; Jul; 07
w       Week in year            Number             27
W       Week in month           Number             2
D       Day in year             Number             189
d       Day in month            Number             10
F       Day of week in month    Number             2
E       Day in week             Text               Tuesday; Tue
a       Am/pm marker            Text               PM
H       Hour in day (0-23)      Number             0
k       Hour in day (1-24)      Number             24
K       Hour in am/pm (0-11)    Number             0
h       Hour in am/pm (1-12)    Number             12
m       Minute in hour          Number             30
s       Second in minute        Number             55
S       Millisecond             Number             978
z       Time zone               General time zone  Pacific Standard Time; PST; GMT-08:00
Z       Time zone               RFC 822 time zone  -0800

Date based query using milliseconds (Java long) time on MongoDB

Let's suppose I need to search all records that match a date condition. On MongoDB I have a bunch of data like this:

{
    "_id" : "9ed3b937-0f43-4613-bd58-cb739a8c5bf6",
    "userModels" : {
        "5080" : {
            "generated_date_timestamp" : NumberLong(1413382499442),
            "model_id" : 5080
        }
    },
    "values" : {}
}

This is the query:

db.anonProfile.find({ 
   "userModels.5080.generated_date_timestamp" : { 
      "$gte" : ISODate("2013-10-01T00:00:00.000Z").getTime() 
   }
});
The getTime() call translates the ISODate into its millisecond value, so it can be compared with the NumberLong timestamps stored in the documents.

Tagged