The Daily Git commands

I always keep googling for a few of the Git commands that I use at small intervals (not very frequently) and end up spending more time than I should. So instead I’ve compiled a list of Git commands that are helpful in my project development, which might be useful for you as well. I’ll keep updating the list as and when I find something useful.

Clone a Git repo (or checkout, in SVN terms):

If it’s a private repo, you need to have your SSH keys added on the server (GitHub, Bitbucket or GitLab), or you can use the HTTPS URL to clone with your login details. For a public repo you don’t need SSH keys or login details. After that you can easily clone the repo.

Navigate to the folder under which you want to clone the repo, and run one of the commands below.

Using SSH (if your public key has been added to the server)

git clone git@github.com:UmeshSingla/user-tags.git

Using HTTPS

git clone https://github.com/UmeshSingla/user-tags.git

Deleting a remote Git Branch:

git push origin --delete branchName

Delete a remote Tag:

git push origin :tagname

After deleting the tag remotely, you need to delete it locally as well, otherwise you won’t be able to re-create a tag with the same name.

Delete a Tag Locally:

git tag -d tagname

Delete a Folder from git repo:

This will delete the folder from the Git repo only (it stays on your filesystem):

git rm -r --cached folderName

To delete the folder from both the filesystem and the Git repo:

git rm -r folderName

If you get any warnings, you can add `-f` to force the delete:

git rm -rf folderName
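
For example, stopping Git from tracking an uploads folder while keeping it on disk is usually a short sequence like the one below (the folder name, commit message and branch are just examples; adding the folder to .gitignore keeps it from sneaking back in on the next git add):

git rm -r --cached uploads
echo "uploads/" >> .gitignore
git commit -m "Stop tracking uploads folder"
git push origin master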

How I fixed my blunder with Node – MySQL DB performance

Thanks to the circumstances that left me with no option other than to complete the Node script, I was able to learn a bit of Node (thanks to Saurabh Shukla for that 😉).

Initially we opted for SQLite for the Node script, considering it is lightweight and would serve the basic purpose of handling the reads and writes we required with good DB performance. But as the number of concurrent users increased, it totally blew up: my Node script stopped responding because it was waiting on the database all the time.

What I concluded: SQLite stores the whole database in a single file, and every read or write I made locked the whole database, dragging down its performance. Thanks to my teammate Aaron, who suggested switching to MySQL.

And it worked much better than SQLite; the script kept running without choking under multiple simultaneous users.

But the happiness wasn’t long-term, because I’m a newbie 😛 when it comes to database administration. As the database size increased it started slowing down again, since I hadn’t implemented table indexes and was relying only on the primary key for all SELECT queries.

As a result a single SELECT query would take up to 9s, 20s and so on. So I added indexes for all the columns used in WHERE clauses. But beware: the more indexes you add, the slower your inserts get, as MySQL has to update every index on each write.
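
For example, an index on a column that shows up in a WHERE clause looks like this (the table and column names here are made up, not the ones from my script); you can check that a SELECT actually uses it by prefixing the query with EXPLAIN:

ALTER TABLE files ADD INDEX idx_file_name (file_name);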

This kind of sorted out a whole lot of performance issues.

This is where I made a big mistake. The performance issue was resolved to a large extent, but then I started facing trouble with some particular file names containing special characters like É and Ã, as I was using a collation other than `utf8_unicode_ci`. I added the ALTER TABLE query to the script and boom, another problem solved. Happy faces all around :)
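
The conversion itself is a one-off statement along these lines (again, the table name is just an example); as the rest of the story shows, it belongs in a migration you run once, not in the script itself:

ALTER TABLE files CONVERT TO CHARACTER SET utf8 COLLATE utf8_unicode_ci;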

Later on I increased the InnoDB buffer pool size after reading about it on http://dba.stackexchange.com/questions/27328/how-large-should-be-mysql-innodb-buffer-pool-size
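
That boils down to a single setting in my.cnf; the value below is only a placeholder, and the linked answer explains how to size it against your available RAM (a MySQL restart is needed for it to take effect):

[mysqld]
# roughly 70–80% of RAM on a dedicated database server; much less on a shared box
innodb_buffer_pool_size = 1G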

And the last issue: after my DB size grew over 1 GB, whenever I restarted my Node request-handler script it would take at least 20 minutes, yeah, minutes not seconds, before MySQL would start responding, and I had to think twice before pushing changes, as each push meant server downtime for that long. And it was all just because I was too careless, or a noob :), to notice I had left the ALTER TABLE query in the script.

So every time I restarted the script, the whole 2 GB table would be rebuilt, taking half an hour or more, and the table stayed locked for that entire period, leaving the server unresponsive. Removing those queries fixed the slowdown on Node server restarts.

Right now I’m a bit relaxed, at least until I run into another mistake of my own 😉

About SQLite: I think that if I had used indexes in SQLite, it would have been scalable enough.

If you are encountering any such issues, you can always check the MySQL slow query log. If you are using phpMyAdmin, you can check the query currently being executed and whether it is holding a lock on a table and blocking other queries, or you can simply log in to your MySQL console and run SHOW FULL PROCESSLIST\G
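
If the slow query log isn’t already enabled, it can be turned on from my.cnf with something like this (MySQL 5.5+; the log path and the one-second threshold are just examples):

[mysqld]
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/mysql-slow.log
long_query_time     = 1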

Checking the slow queries in PMA

For updated PMA: checking the current process list

If you have any other methods to tune a MySQL database for performance, drop them in the comments, I’d love to hear them.

I’m already planning to implement table compression and partitioning, once I’ve read up on them thoroughly.

HTTP POST request in Node.js

I have been working in Node.js for the past few months, and I’ve found that tracking down the right solution takes some time, as there is relatively little reference material out there for Node.js compared to other programming languages.

So if you are new and working on a Node.js application, you’ll probably need to GET and POST data. There are plenty of references out there demonstrating a GET request, but very few for a POST request. To help you out, here is an example of an HTTP POST request in Node.js.
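
A minimal version, using Node’s core http module together with querystring, looks something like this (the host, path and form fields are placeholders):

var http = require('http');
var querystring = require('querystring');

// Build the POST body as a URL-encoded string (placeholder fields)
var postData = querystring.stringify({
	name: 'example',
	action: 'subscribe'
});

var options = {
	host: 'example.com',    // placeholder host
	port: 80,
	path: '/api/subscribe', // placeholder path
	method: 'POST',
	headers: {
		'Content-Type': 'application/x-www-form-urlencoded',
		'Content-Length': Buffer.byteLength(postData)
	}
};

var req = http.request(options, function (res) {
	var body = '';
	res.on('data', function (chunk) {
		body += chunk;
	});
	res.on('end', function () {
		console.log('Status:', res.statusCode);
		console.log('Response:', body);
	});
});

req.on('error', function (err) {
	console.error('Request failed:', err.message);
});

// Send the body and finish the request
req.write(postData);
req.end();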

The querystring module ships with Node core, and there is also an npm package of the same name you can install with npm install querystring. Also don’t forget to create a package.json file for your Node application, as it helps you set things up faster on a new server in case you have to move things.

Update:

After working for a few days, as the number of GET and POST requests increased, it became too tough to keep the script running for more than a few hours. It would exit with ‘EMCONNECT’ or ‘EMFILE’ errors, leaving me frustrated with no solution.

After trying Node’s core http module and the request library, I had to switch to hyperquest, which does help to a large extent.

Post file using wp_remote_post in WordPress

The WordPress HTTP API standardizes HTTP requests so that you can use a single function instead of writing support for different transport methods depending on the web hosting.

Instead of directly using curl or any other method, you can use the WordPress HTTP API.

wp_remote_post allows you to send an HTTP POST request, but it doesn’t support sending a file over to the URL out of the box.

For posting a file using wp_remote_post, you can use the hack given below; it was originally suggested by David Anderson on the WordPress mailing list.
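
The general idea is to build a multipart/form-data body by hand and pass it to wp_remote_post; here is a rough sketch of that approach (the endpoint URL, field name and timeout are placeholders, not David’s exact code):

$url       = 'https://example.com/upload'; // placeholder endpoint
$file_path = '/path/to/file.zip';          // the file to send
$boundary  = wp_generate_password( 24, false );

// Build a multipart/form-data body by hand; 'file' is a placeholder field name
$body  = '--' . $boundary . "\r\n";
$body .= 'Content-Disposition: form-data; name="file"; filename="' . basename( $file_path ) . '"' . "\r\n";
$body .= 'Content-Type: application/octet-stream' . "\r\n\r\n";
$body .= file_get_contents( $file_path ) . "\r\n";
$body .= '--' . $boundary . '--';

$response = wp_remote_post( $url, array(
	'headers' => array(
		'Content-Type' => 'multipart/form-data; boundary=' . $boundary,
	),
	'body'    => $body,
	'timeout' => 60, // larger files need a longer timeout
) );

if ( is_wp_error( $response ) ) {
	error_log( $response->get_error_message() );
}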

It works perfectly fine; however, it might have some issues with bigger files, since this approach reads the whole file into memory before sending.

MAMP or the Easyengine Vagrant

Well, to be honest, I’ve never even tried using MAMP because I never felt the need for it, so I can’t give you the pros and cons of MAMP.

I’ve recently switched to Vagrant with EasyEngine, a good alternative to a manual WordPress setup, and things are so easy for site development on a Mac that I never felt the need for MAMP. However, I’ve been reading up on the pros and cons of Vagrant over MAMP; you can refer to From MAMP to Vagrant for those.

You can easily set up Vagrant using their installation guide.

Once you are done setting up Vagrant and EasyEngine, a whole WordPress site setup takes less than a minute (on an 8 Mbps connection) or a few seconds (on a DigitalOcean server), as the only time EasyEngine really spends is on downloading WordPress; the rest of the setup is a breeze. It is really handy for WordPress developers.

EasyEngine uses the Nginx server rather than Apache, which is considered better in terms of performance. There are lots of site-management options for a production server; you can get all the details on their site: https://rtcamp.com/easyengine/

JSON response in WordPress

I’ve been working in WordPress for more than a year, developing custom themes and plugins, and Ajax is commonly used in most of them. In general, most of us use json_encode() for returning a JSON response in WordPress.

The WordPress 3.5 release introduced new functions for sending JSON responses:

wp_send_json()
wp_send_json_success()
wp_send_json_error()

These functions send a JSON response back to an Ajax request, for either success or error, and then die().
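
Here’s a minimal sketch of an Ajax handler using them; the action name and the data being returned are made up for illustration:

add_action( 'wp_ajax_cc_comment_count', 'cc_comment_count_handler' );

function cc_comment_count_handler() {
	if ( ! is_user_logged_in() ) {
		// Outputs {"success":false,"data":{...}} with the right headers and die()s
		wp_send_json_error( array( 'message' => 'Not logged in' ) );
	}

	// Count the current user's comments (made-up example data)
	$count = get_comments( array(
		'user_id' => get_current_user_id(),
		'count'   => true,
	) );

	// Outputs {"success":true,"data":{...}} with the right headers and die()s
	wp_send_json_success( array( 'count' => $count ) );
}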

Well, you’d be wondering why to use them, and I’d say there is no harm in following the WordPress standards. Also, until now I never knew that I had to set the headers too, which I learned from these functions.
courtesy: Faishal

Automated WordPress installation

If you are a developer, you might need to install WordPress setups frequently, which gets quite boring with time.

The alternative to manual installation is using a command-line tool that can set the whole thing up for you. EasyEngine is one such command-line tool; it helps you install your site on an Nginx server in about 3 minutes.

You can disable or enable any site on your server.

It gives you options to set up a site with the Nginx FastCGI cache, the WP Super Cache or W3 Total Cache plugin, as a multisite, and with subdomains. Currently EasyEngine supports Ubuntu 12.04 LTS and Debian.
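
The exact flags vary between EasyEngine versions, but the commands look roughly like this (example.com is a placeholder):

# plain WordPress site
ee site create example.com --wp

# WordPress with Nginx FastCGI cache, or with the W3 Total Cache plugin
ee site create example.com --wpfc
ee site create example.com --w3tc

# WordPress multisite on subdomains
ee site create example.com --wpsubdom

# disable / enable an existing site
ee site disable example.com
ee site enable example.com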

Here is the list of commands you can refer to: Easyengine Commands

WordPress comment approval using Gmail actions

Gmail recently added email actions to allow its users to take actions directly from their inbox (good enough for lazy pals like me :) ). These actions let you review things, follow links (like the Watch Video button YouTube provides in Gmail), and perform other actions.

You can refer to the official Gmail blog for feature details.

And I’ve used the same to approve WordPress comments directly from email, thank you Incsub for the nice idea! :)

You can check the WordPress plugin: Gmail Comment Approval

Before you start using the plugin, you need to make sure that you fulfill some of the requirements to make it work with Gmail.

Install and activate the plugin

After successful installation, post a dummy comment on your site. If comment moderation is turned on, you will receive a comment approval email.

Registering with Google

Register your site’s sender email with Gmail to allow the action button to appear in the site moderator’s Gmail inbox.

  • In order to register with Gmail, you need the sample email you got for comment approval.
  • Forward the email to [email protected] with a proper subject.
  • Copy the email subject and fill out the registration form; you need to add the email headers at the end of the form. To get the email headers, refer to Viewing Message Header in Gmail

SPF/DKIM signed emails

One of the important requirements for Gmail actions is that emails be signed using SPF or DKIM, as a security measure.

Testing email for DKIM/SPF signatures

You can check if your emails are SPF/DKIM signed or not at Mail Tester

If they are not signed properly, you need to add the signatures. If you are using a VPS or dedicated hosting like DigitalOcean, you can set it up using the tutorial DKIM with Postfix.

For shared hosting, you need to contact your hosting provider to enable DKIM signing of emails.

If you are using DKIM signing, make sure you haven’t enabled test mode.

You can also use third-party services like Postmark or Google Apps for mail delivery; they sign emails properly.

As soon as you’re done with registration, Gmail will show an ‘Approve Comment’ button in your email.

In case you are sending the email to your own address, you don’t need to register; the action button appears by default.

Hope it saves some time for you.

Happy Coding.

Loading js in shortcode

Shortcodes are very commonly used on WordPress sites, and a number of scripts and styles might be associated with them.

Loading all those shortcode-specific scripts on each and every page adds unnecessary load time to the rest of the pages.

So you can instead load the scripts and styles conditionally.

Register the script or style you want to load conditionally:

add_action( 'wp_enqueue_scripts', 'cc_register_shortcode_script_style' );

/**
 * Register style sheet and scripts
 */
function cc_register_shortcode_script_style() {
	wp_register_style( 'my-shortcode-style', '/path/to/mystyle.css' );
	wp_register_script( 'my-shortcode-script', '/path/to/script.js', array(), '1.2.3', true );
}

Then enqueue the style and scripts inside your shortcode handler, so that whenever the shortcode runs, the scripts and styles are loaded on that page only.

add_shortcode( 'cc_shortcode', 'cc_shortcode_handler' );

function cc_shortcode_handler( $atts ) {
	wp_enqueue_script( 'my-shortcode-script' );
	wp_enqueue_style( 'my-shortcode-style' );

	// actual shortcode handling here
}
 Ref: WordPress Trac Ticket