MySQL Optimization

From MySQL Large Configuration: the MySQL "large" my.cnf, the original settings, the values after optimization, and other ideas. Highlights: set innodb_buffer_pool_instances = 2; reduced max_connections to 500 (which is still high); for SYN errors, adjust /proc/sys/net/ipv4/tcp_max_syn_backlog; for "nf_conntrack: table full, dropping packet", tune network and ulimit settings; to control the number of connections in TIME_WAIT, set tcp_tw_recycle and lower tcp_fin_timeout … Read more
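The TCP tweaks mentioned above would typically live in /etc/sysctl.conf. A minimal sketch with illustrative values (tune for your workload; note that tcp_tw_recycle was removed from the kernel in Linux 4.12, so it only applies to older systems like the one described):

```
# /etc/sysctl.conf -- illustrative values, not recommendations
net.ipv4.tcp_max_syn_backlog = 4096      # larger SYN backlog for bursty connects
net.ipv4.tcp_fin_timeout = 15            # reap FIN-WAIT/TIME_WAIT sockets sooner
net.ipv4.tcp_tw_recycle = 1              # pre-4.12 kernels only; breaks NAT clients
net.netfilter.nf_conntrack_max = 131072  # raise "table full, dropping packet" limit
```

Apply with `sysctl -p` and watch `ss -s` to see whether the TIME_WAIT count actually drops.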

MySQL Shortcuts

Linux shortcuts for Plesk. To find the Plesk password on older Plesk versions, log in via SSH as root and run the commands to: show the encrypted admin password, show the Plesk MySQL password, show the processlist, and export or import a Plesk database: # mysql -u admin -p`cat /etc/psa/.psa.shadow` database < /tmp/database.sql. Email commands: /usr/local/psa/admin/bin/mailqueuemng -s, tail -50 /usr/local/psa/var/log/maillog, tail -50 /var/log/maillog ... Read more
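Collected in one place, the shortcuts the excerpt lists look roughly like this. The paths are the standard Plesk locations mentioned in the post; verify them against your Plesk version before running:

```shell
# Plesk admin password (stored, possibly encrypted, in the psa shadow file)
cat /etc/psa/.psa.shadow

# Log in to MySQL as the Plesk admin user
mysql -u admin -p`cat /etc/psa/.psa.shadow`

# Export / import a Plesk-managed database
mysqldump -u admin -p`cat /etc/psa/.psa.shadow` database > /tmp/database.sql
mysql -u admin -p`cat /etc/psa/.psa.shadow` database < /tmp/database.sql

# Mail queue summary and recent mail log
/usr/local/psa/admin/bin/mailqueuemng -s
tail -50 /usr/local/psa/var/log/maillog
tail -50 /var/log/maillog
```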

PLESK: How to Create Multiple MySQL DBs with a Single User

Log in with SSH, then log in to MySQL: # mysql -uadmin -p`cat /etc/psa/.psa.shadow`. Use the mysql database: mysql> use mysql; mysql> SELECT * FROM db; To add the same user to another database, insert that user into the db table and give him the same privileges he already has for his existing database: mysql> INSERT INTO db VALUES('localhost','second_db','same_username_you_used_for_first_db','Y','Y','Y','Y','Y','Y','N','Y','Y','Y','Y','Y','Y','Y','Y','Y','Y'); … Read more
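Inserting directly into mysql.db works, but the running server will not see the change until the privilege cache is flushed. On any reasonably recent MySQL the same result is usually achieved with GRANT, which updates the cache for you (the database and user names below are the placeholders from the example):

```sql
-- Equivalent of the manual INSERT above, using the privilege system directly
GRANT ALL PRIVILEGES ON second_db.* TO 'same_username_you_used_for_first_db'@'localhost';

-- Only needed after editing the grant tables by hand (e.g. the INSERT INTO db above)
FLUSH PRIVILEGES;
```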

MySQL Import with phpMyAdmin Errors – The used command is not allowed with this MySQL version

Trying to LOAD DATA with phpMyAdmin gives: #1085 – The file '/tmp/phpqcXPYv' must be in the database directory or be readable by all, or #1148 – The used command is not allowed with this MySQL version. Check /etc/mysql/my.cnf: under [mysqld], local-infile=0 disables the command. If this is set to local-infile=1 – and MySQL is restarted – this should work. Also – if that … Read more
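A minimal sketch of the relevant my.cnf stanza and a matching client invocation (the file and database names are illustrative; note LOCAL infile must be enabled on both the server and the client side):

```
# /etc/mysql/my.cnf
[mysqld]
local-infile=1

# restart MySQL after changing this, then on the client side:
#   mysql --local-infile=1 -u user -p dbname
#   mysql> LOAD DATA LOCAL INFILE '/tmp/data.csv' INTO TABLE mytable;
```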

Analyze the Slow Query Log

There is a nice tool, Percona's pt-query-digest, at http://www.percona.com/doc/percona-toolkit/2.2/pt-query-digest.html, which will automatically parse the slow query log and analyze it for the slowest queries. Run the command pt-query-digest slowquerylog.txt, which should output a summary and a list of the longest-running queries.
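The core idea behind pt-query-digest is simple: fingerprint each query by collapsing its literals so variants of the same statement group together, then aggregate Query_time per fingerprint. A toy sketch of that idea in Python (not the real tool — pt-query-digest handles far more of the slow-log format than this):

```python
import re
from collections import defaultdict

def digest_slow_log(text):
    """Total Query_time per normalized query from a MySQL slow-log excerpt.

    Toy approximation of pt-query-digest's aggregation: string and
    numeric literals are replaced with '?' so that variants of the
    same statement share one fingerprint.
    """
    totals = defaultdict(float)
    current_time = 0.0
    for line in text.splitlines():
        line = line.strip()
        m = re.match(r"# Query_time: ([\d.]+)", line)
        if m:
            current_time = float(m.group(1))      # remember time for next query
        elif line and not line.startswith("#"):
            fp = re.sub(r"'[^']*'", "?", line)    # collapse string literals
            fp = re.sub(r"\b\d+\b", "?", fp)      # collapse numeric literals
            totals[fp] += current_time
    return dict(totals)

log = """\
# Query_time: 2.5  Lock_time: 0.0
SELECT * FROM users WHERE id = 7;
# Query_time: 1.5  Lock_time: 0.0
SELECT * FROM users WHERE id = 42;
"""
print(digest_slow_log(log))
# Both SELECTs collapse to one fingerprint with a combined time of 4.0s.
```

The real tool adds ranking, percentiles, and EXPLAIN hooks, but the fingerprint-then-aggregate loop above is the part that makes "slowest query pattern" (rather than "slowest single query") the unit of analysis.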