Apple iPhone 13 Series and the Ongoing Display Issue After iOS Updates

The iPhone 13 display issue has been ongoing for more than a year now, reportedly triggered by iOS updates. And it's affecting not just me in India but thousands of Apple customers across the globe. Apple is doing nothing about it; they're just sitting silently on the reports they've been receiving daily.

Twitter, Facebook, YouTube, and Instagram are all filled with such complaints, and still there has been no statement from Apple. Not sure what they are waiting for. Maybe a change.org campaign 😀 or a fresh lawsuit from a particular state?

After spending so much money on an Apple device, the least a customer expects is reliability; after all, it's one of the most premium phones. And if for some reason a particular series is affected, the expectation would be accountability from Apple. But in the case of the iPhone 13 series, Apple has shown neither reliability nor accountability for its mistake.

Here is the series of events that happened with my device:

After installing the latest iOS update, I immediately noticed lines on my display, which got worse within a day, turning into big vertical lines that rendered the display completely unusable.

I reached out to customer support, and they suggested basic diagnostics, which were of no help. The customer care executive further told me that it's a known issue with the iPhone 13 series caused by iOS updates and that they're waiting for Apple to launch a service program. I was told to visit the service centre for a physical inspection.

That gave me some hope, but upon checking further I found year-old threads describing the same issue, which made me realize it's an old problem that Apple has done nothing about.

At the service centre, the executive checked the device and confirmed there is no physical damage. Upon checking further, he said it's going to be a paid repair since my device is out of warranty.

Upon discussing it with him further, he told me, again, that it's a known issue happening because of iOS updates.

If everyone at Apple knows it's a known issue with the iPhone 13 series, why is Apple not doing anything about it? Or do they just want to make money while they can and then launch a program just for the sake of it?

This issue would have been acceptable if only a few devices were affected, but in this scenario there are thousands of users complaining about it. This definitely points to a quality issue on Apple's side, and they should pay for their mistake, not the consumer.

OnePlus phones were facing a similar issue; OnePlus accepted the mistake and at least offered a free screen replacement. If OnePlus can listen to their customers, why can't the world's largest technology company do it?

If you've been affected by the iPhone 13 display issue, make sure that you raise your voice. Be it a Twitter post, a Facebook post, or a post on the Apple forums, just add to it.

#AppleBrokeiphone13 #KindlyFix

Linking some of the active complaints across the Internet:

https://discussions.apple.com/thread/254796670 – More than 300 users have marked on this single thread that they are facing exactly the same issue.

ACF – Order Custom Posts with a Date Field

To order custom post types by an ACF field named start_date and display only posts with dates set in the future, this code proved effective for me.

The custom field start_date is part of the group field date_range, so it's stored as date_range_start_date in post meta by the Advanced Custom Fields plugin.

$args = array(
	'post_type'      => 'custom_post_type',
	'posts_per_page' => 15,
	'post_status'    => 'publish',
	'fields'         => 'ids', // return post IDs only, which is lighter
	'meta_key'       => 'date_range_start_date',
	'orderby'        => array( 'meta_value' => 'ASC' ),
	'meta_query'     => array(
		// meta_query expects an array of clause arrays
		array(
			'key'     => 'date_range_start_date',
			'value'   => date( 'Ymd' ), // ACF stores date picker values as Ymd
			'type'    => 'DATE',
			'compare' => '>=',
		),
	),
);
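
To actually run the query, pass the array to WP_Query. A minimal usage sketch (the loop body is just for illustration):

$query = new WP_Query( $args );

// With 'fields' => 'ids', $query->posts is an array of post IDs.
foreach ( $query->posts as $post_id ) {
	echo esc_html( get_the_title( $post_id ) );
}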

Auto Resize a Custom WP Editor Instance

WordPress supports auto-resizing of the post editor. It's helpful when your post content grows longer, so that you don't have to keep scrolling. You can use the same auto-resizing WP Editor instance in a metabox or a widget.

If you're using a custom instance of WP Editor via the wp_editor() function, you just need to pass a few additional arguments to enable this neat feature.

Here is the code to add an instance of WP Editor in a metabox with auto-resizing turned on.

/* Define the custom box */
add_action( 'add_meta_boxes', 'my_meta_box' );

/* Do something with the data entered */
add_action( 'save_post', 'custom_save_post_meta' );

/* Adds a box to the main column on the Post and Page edit screens */
function my_meta_box() {
	add_meta_box( 'wp_editor_metabox', 'Metabox with Editor', 'wp_editor_meta_box' );
}

/* Prints the box content */
function wp_editor_meta_box( $post ) {

	// Use nonce for verification
	wp_nonce_field( plugin_basename( __FILE__ ), 'metabox_nonce' );

	$field_value = get_post_meta( $post->ID, 'custom_meta', true );

	// Add a resizable editor, with a minimum height of 100px
	$args = array(
		'tinymce' => array(
			'autoresize_min_height' => 100,
			'wp_autoresize_on'      => true,
			'plugins'               => 'wpautoresize',
			'toolbar1'              => 'bold,italic,underline,link,unlink,forecolor',
			'toolbar2'              => '',
		),
	);
	wp_editor( $field_value, 'custom_meta', $args );
}

/* When the post is saved, saves our custom data */
function custom_save_post_meta( $post_id ) {

	// Verify if this is an autosave routine.
	// If it is, our form has not been submitted, so we don't want to do anything.
	if ( defined( 'DOING_AUTOSAVE' ) && DOING_AUTOSAVE ) {
		return;
	}

	// Verify this came from our screen and with proper authorization,
	// because save_post can be triggered at other times.
	if ( ! isset( $_POST['metabox_nonce'] ) || ! wp_verify_nonce( $_POST['metabox_nonce'], plugin_basename( __FILE__ ) ) ) {
		return;
	}

	// Check permissions
	if ( isset( $_POST['post_type'] ) && 'page' === $_POST['post_type'] ) {
		if ( ! current_user_can( 'edit_page', $post_id ) ) {
			return;
		}
	} else {
		if ( ! current_user_can( 'edit_post', $post_id ) ) {
			return;
		}
	}

	// OK, we're authenticated: find and save the data, allowing only post-safe HTML
	if ( isset( $_POST['custom_meta'] ) ) {
		update_post_meta( $post_id, 'custom_meta', wp_kses_post( $_POST['custom_meta'] ) );
	}
}

Related WordPress Stack Exchange answer: https://wordpress.stackexchange.com/questions/284112/auto-resize-when-writing-with-custom-post-wp-editor/284465#284465

ASUS RT AC55UHP AC1200 Router Review

Revised Verdict – 6/10

The firmware is very poor no matter how good the hardware is, especially when it comes to compatibility with Chromecast. It got so irritating that I had to revise the review. Their customer support is not very helpful either, especially in India.

Verdict – 8/10

If you're looking for a new router, you should definitely go for an AC wireless-standard router, and the ASUS RT AC55UHP is a good choice. It fits right into your budget if you need a good, reliable WiFi connection in a medium/large-size apartment. It offers very good speed and stable range over a large area, across multiple walls.

With its two 9 dBi high-power antennas, it works flawlessly within an area of around 900 sq ft (tested) and penetrates well across three concrete walls on the 2.4 GHz band. Internet speed over the 5 GHz band is amazing, considering the drop against the LAN port is only 22%, where other, more expensive routers easily drop 30-50%.

The highest speed you get on the 5 GHz channel is around 70 Mbps over WiFi against 92 Mbps over the LAN/Ethernet port, and the upload speed goes up to 93 Mbps, the same as over the Ethernet port.

Cons: Over 2-3 days of usage, nothing so far, just that the signal drop is significant if I put the router in a bit of a corner, which is really the same with all routers because of interference.

Detailed Review:

I had to replace my old Netgear WNDR3400v3 – N600 dual-band router, as I got a new 100 Mbps connection. The main reason for the upgrade was the poor speed and WiFi range; I was ready to compromise a bit on speed, but the low WiFi range made it totally unusable.

I tried the Netgear WN3000RP-200INS WiFi extender, but it was a total letdown in terms of speed and range with multiple devices (5) connected to it. After a lot of research I came down to 2 routers:

Asus AC1900 RT-AC68U (INR 13.5K, approx. $208) and
Netgear R7000-100INS Nighthawk AC1900 (INR 12K, approx. $185),

but both of them were a bit too costly.

The ASUS RT AC55UHP was recommended by a friend who was using a similar model.

WIFI Range

Let's come down to the coverage map of the router. I've tested it over an area of around 900 sq ft.

WiFi coverage map of my apartment (750 sq ft) including the adjacent apartment, for a total area of around 900 sq ft. *Image credits at the end

With my old Netgear router, the WiFi range was quite low, barely reaching across the first wall. The Asus outperformed it with good range across three concrete walls, offering speeds of up to 20 Mbps on 2.4 GHz, even with the third wall being thicker than the other two.

Here is a screenshot comparing the WiFi performance of the 5 GHz band in the same room, across the first wall, and across two walls.

5 GHz Band: The WiFi range drops significantly after two walls, but I was still able to get a download/upload speed of up to 40 Mbps.

2.4 GHz Band: WiFi range is quite good across three walls, and drops significantly after the fourth wall.

Speed Test

LAN

The ASUS RT AC55UHP offers good speed over the LAN port. I was getting a maximum of 87 Mbps on my existing router, and the Asus outperformed it with 92.80/93.68 Mbps and a ping time of 2 ms.

Speed test results over Ethernet

WIFI

With reference to the coverage map, here are the 5 GHz speed test results in the same room, across a single wall, and across multiple walls.

Here are the speed test results for the 2.4 GHz band; the drop in upload speed is because of the ISP, not the router:

Looking at the screenshots, speed over 5 GHz in the same room went up to 70 Mbps download / 93 Mbps upload, and it dropped with each wall introduced between the router and the device. With up to two walls, the Internet stays reliable.

The lower 2.4 GHz band trades speed for range, and the drop in speed is tolerable, considering I was getting 20 Mbps in region 4, i.e. across three walls.

* Seq. 1, 2, 3, 4 in the screenshots refers to the speed test areas marked in the WiFi coverage map.

Router Placement

Make sure to keep the router in an open area, not a complete corner, and 3-4 ft above the ground, to get better range; the signal drops rapidly if the router is tucked into a corner. The antennas should point in different directions.

That sums up the wifi coverage and speed of the router.

Inside the Box

You get your Asus router, two 9 dBi antennas, an Ethernet cable, a power adapter, a support to align the antennas properly, and an installation CD.

You won't need the installation CD, unless you have to troubleshoot and re-install the firmware for some reason.


Router, power adapter, Ethernet cable, two 9 dBi antennas, antenna support/aligner, installation CD

Setup

Setting up this router is fairly easy: it automatically checks the type of Internet connection and prompts you for ISP login details if required. You'll want to update the firmware as soon as you get the router; the latest one is 3.0.0.4.380.4180, released on December 29th, 2016.


Hardware

The Asus RT-AC55UHP has a Qualcomm chipset, with 128 MB of flash and 128 MB of RAM.

For further reference: https://wikidevi.com/wiki/ASUS_RT-AC55UHP

Custom Firmware

As of now there is no custom firmware available for the Asus RT-AC55UHP, but OpenWrt does have a build for the Asus RT-AC55U.

Apart from that, there are a lot of other features like AiCloud, a firewall, VPN support, and a USB 3.0 port. I haven't played with any of those yet.

Just make sure to register your product online with Asus to avail of the warranty.

* Coverage map image is designed by Ganesh Kerkar

Guetzli Installation and Performance Report

Guetzli is a recently open-sourced JPEG encoder from Google. I gave it a whirl to test the performance/timing of its image compression. I've tried out PNG→JPG and JPG-only compression with the default quality.

Installation on Ubuntu:

Follow the instructions given in the GitHub repo:

  • Clone/download the source code from the GitHub repo.
  • Install libpng and gflags: apt-get install libpng-dev libgflags-dev.
  • Run make; the binary gets created inside the checkout at bin/Release/guetzli.
  • Run vim ~/.bashrc, add export PATH=~/guetzli/bin/Release:$PATH at the end of the file and save it, where guetzli is your checkout folder name. These steps are collected in the snippet below.
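
Putting it all together, assuming you clone into ~/guetzli (the input/output file names below are just placeholders):

sudo apt-get install libpng-dev libgflags-dev
git clone https://github.com/google/guetzli.git ~/guetzli
cd ~/guetzli && make
echo 'export PATH=~/guetzli/bin/Release:$PATH' >> ~/.bashrc
source ~/.bashrc

# basic usage; quality defaults to 95 and can't go below 84
guetzli input.png output.jpg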

Performance:

I've tested five PNGs and five JPEGs for conversion on a DigitalOcean single-core Ubuntu droplet with 500 MB of RAM.

PNG Optimisation:

Image                    Original Size   Optimised Size   Time
bees.png                 177 KB          37 KB            0m 16.372s
Old_Jewish_man.png*      49 KB           195 KB           16m 5.397s
Render_Homer_8.png       218 KB          146 KB           0m 59.611s
Winter-Is-Coming.png     1.3 MB          286 KB           2m 26.712s
Himalayas.png            546 KB          265 KB           4m 52.428s

JPEG optimisation:

Image                          Original Size   Optimised Size   Time
100_5478.jpg                   189 KB          168 KB           2m 14.152s
Eiffel_Tower_Black_White.jpg   250 KB          132 KB           0m 45.584s
taj-mahal-H.jpeg               473 KB          153 KB           2m 44.823s
Leopard Wallpaper.jpg          699 KB          596 KB           11m 35.742s
Kangaroo.jpg                   270 KB          237 KB           3m 42.517s

* For transparent PNGs, Guetzli automatically inserts a black background while converting them to JPG; here the image size went 4x higher, from 49 KB to 195 KB, and the conversion took 16 minutes.

PNG-to-JPG optimisation is pretty damn good for images without an alpha channel, aka transparency. Though the time taken for a 177 KB file is 16 s, which is way too slow: even if your site has as few as 500 images, most of them would likely be above 500 KB.

The same goes for JPG-to-JPG optimisation: a few images were reduced to half their size, while others saved up to 100 KB. The time spent on optimisation, though, is still very high across the board.

Images for comparison:

PNG to JPG

bees.png – original image
bees.jpg – optimised image

JPG to JPG

taj-mahal-H.jpeg – original image
taj-mahal-H-optimised.jpeg – optimised image

Guetzli is too time- and resource-consuming to be implemented in production. I was not able to optimise a 4 MB image, as the process was killed every time (most likely running out of memory on the 500 MB droplet).

The Daily Git Commands

I always find myself googling for a few of the Git commands I use at longer intervals (not very frequently) and end up spending more time than I should. So instead I've compiled a list of Git commands that are helpful in my project development, which might be useful for you as well. I'll keep updating the list as and when I find something useful.

Clone a Git repo (or checkout, in SVN terms):

If it's a private repo, you need to have your SSH keys added to the server (GitHub, Bitbucket, or GitLab), or you can use the HTTPS URL to clone with your login details. For a public repo you don't need SSH keys or login details. After that, you can easily clone the repo.

Navigate to the folder under which you want to clone the repo, and run the command.

Using SSH (if public keys have been added to the server):

git clone git@github.com:UmeshSingla/user-tags.git

Using HTTPS:

git clone https://github.com/UmeshSingla/user-tags.git

Deleting a remote Git Branch:

git push origin --delete branchName

Delete a remote Tag:

git push origin :tagname

After deleting the tag remotely, you need to delete it locally as well; otherwise you won't be able to cleanly re-create the tag.

Delete a Tag Locally:

git tag -d tagname
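
Newer Git versions also accept the same --delete flag for tags, which reads more clearly than the colon syntax:

git push origin --delete tagname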

Delete a folder from a Git repo:

This will delete the folder only from the Git repo (it stays on disk):

git rm -r --cached folderName

In order to delete the folder from both the filesystem and the Git repo:

git rm -r folderName

If you are getting any warnings, you can add f to force the delete:

git rm -rf folderName
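
Note that git rm --cached only stages the removal; to make it stick (and keep the folder out of the repo in future), commit the change and add the folder to .gitignore. A quick sketch, assuming you also want it ignored going forward:

git rm -r --cached folderName
echo "folderName/" >> .gitignore
git commit -m "Stop tracking folderName"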

How I Fixed My Blunder with Node + MySQL DB Performance

Thanks to circumstances that left me with no option other than to complete the Node script myself, I was able to learn a bit of Node.

Initially, for the Node script we opted for SQLite, considering it is lightweight and would serve the basic purpose of handling the reads and writes we required with good performance. But as the number of concurrent users increased, it totally blew up: my Node script stopped responding, as it was waiting on the database all the time.

What I concluded: SQLite stores the whole database in a single file, and each read or write locks that file, slowing everything down. Thanks to one of my teammates, Aaron, who suggested switching to MySQL.

And it worked much better than SQLite; the script ran without choking under multiple simultaneous users.

But the happiness wasn't long-lived, because I'm a newbie 😛 when it comes to database administration. As the database size increased, it started slowing down again, since I hadn't implemented table indexes and was relying only on my primary key for all SELECT queries.

As a result, a single SELECT query would take up to 9 s, 20 s, and so on. So I added indexes for all columns used in WHERE clauses; an example is sketched below. But beware: the more indexes you have, the slower inserts become, as MySQL has to update every index on each write.
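
For illustration, here is what that looks like in SQL (the table and column names here are hypothetical, not from the actual script):

-- index a column that appears in WHERE clauses
CREATE INDEX idx_file_name ON files (file_name);

-- confirm a query actually uses the index
EXPLAIN SELECT * FROM files WHERE file_name = 'photo.jpg';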

This sorted out a whole lot of the performance issues.

This is where I made a big mistake. The performance issue was resolved to a large extent, but then I started facing trouble with some particular file names containing special characters like É and Ã, as I was using a character set other than utf8 (with the `utf8_unicode_ci` collation). I added the ALTER TABLE query to the script and boom, another problem solved. Happy faces all around 🙂
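
The fix itself is a one-time conversion along these lines (again, the table name is hypothetical):

ALTER TABLE files CONVERT TO CHARACTER SET utf8 COLLATE utf8_unicode_ci;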

Later on, I increased the buffer pool size after reading about it at http://dba.stackexchange.com/questions/27328/how-large-should-be-mysql-innodb-buffer-pool-size
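
The setting lives in my.cnf; the value below is only illustrative (the usual rule of thumb is 60-80% of RAM on a dedicated database server):

[mysqld]
innodb_buffer_pool_size = 1G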

And the last issue: after my DB size grew over 1 GB, whenever I restarted my Node request-handler script, it would take at least 20 minutes (yeah, minutes, not seconds) before MySQL would start responding, and I had to think twice before pushing changes, as each push meant server downtime for that long. And it was all just because I was too careless, or a noob :), to have left the ALTER TABLE query in the script.

So every time I restarted the script, the whole 2 GB table would be modified, taking half an hour or more and keeping the table locked for that entire period, leaving the server unresponsive. Removing those queries solved the whole slowdown on Node server restart.

Right now, I'm a bit relaxed, unless I encounter another mistake of my own 😉

About SQLite: I think if I had used indexes in SQLite, it would have been scalable enough.

If you are encountering any such issues, you can always check the MySQL slow query log. If you are using phpMyAdmin, you can check the current query being executed and whether it is holding a lock over the table, blocking other queries. Or you can simply log in to your MySQL console and run SHOW FULL PROCESSLIST\G
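
If the slow query log isn't enabled yet, you can switch it on at runtime (the threshold below is just an example):

SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 2;  -- log anything slower than 2 seconds
SHOW FULL PROCESSLIST;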

Checking the slow queries in PMA

For updated PMA:

Checking the current process list

If you have any other methods to tune a MySQL database for performance, drop them in the comments; I'd love to hear them.

I'm already planning to implement table compression and partitioning, once I've read up on them thoroughly.

HTTP POST Request in Node.js

I have been working in Node.js for the past few months, and I've found that landing on the right solution for Node.js takes some time, as there are relatively fewer references out there for Node.js compared to other programming languages.

So if you are new and working on a Node.js application, you'll probably need to GET and POST data. There are plenty of references out there demonstrating GET requests, but very few for POST requests. To help you out, here is an example of an HTTP POST request in Node.js.
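
A minimal sketch using Node's core http and querystring modules (the hostname, path, and form fields are placeholders):

var http = require('http');
var querystring = require('querystring');

// form-encode the data to send
var postData = querystring.stringify({
    name: 'example', // placeholder field
    value: '42'
});

var options = {
    hostname: 'example.com', // placeholder endpoint
    port: 80,
    path: '/submit',
    method: 'POST',
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': Buffer.byteLength(postData)
    }
};

var req = http.request(options, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () { console.log(res.statusCode, body); });
});

req.on('error', function (err) { console.error(err); });
req.write(postData);
req.end();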

You can easily install the querystring package using npm install querystring, though a querystring module also ships with Node core. Also, don't forget to create a package.json file for your Node application, as it helps you set up faster on a new server if you ever have to move things.

Update:

After working for a few days, as the number of GET and POST requests increased, it became too tough to keep the script running for more than a few hours. It would exit with 'EMCONNECT' or 'EMFILE' errors, leaving me frustrated with no solution.

After trying Node's core http package and the request library, I had to switch to Hyperquest, which does help to a large extent.

Post a File Using wp_remote_post in WordPress

The WordPress HTTP API standardizes HTTP requests, letting you use a single function instead of supporting different transport methods depending on the web hosting.

Instead of directly using cURL or any other method, you can use the WordPress HTTP API.

wp_remote_post allows you to send an HTTP POST request, but out of the box it doesn't support sending a file over to the URL.

For posting a file using wp_remote_post, you can use the hack given below. It was originally suggested by David Anderson on the WordPress mailing list.
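
A minimal sketch of the idea: build a multipart/form-data body by hand and set the matching boundary header (the URL, field name, and file path are placeholders):

$file     = '/path/to/file.jpg'; // placeholder path
$boundary = wp_generate_password( 24, false );

// assemble the multipart body manually
$body  = '--' . $boundary . "\r\n";
$body .= 'Content-Disposition: form-data; name="file"; filename="' . basename( $file ) . '"' . "\r\n";
$body .= 'Content-Type: application/octet-stream' . "\r\n\r\n";
$body .= file_get_contents( $file ) . "\r\n";
$body .= '--' . $boundary . '--' . "\r\n";

$response = wp_remote_post( 'https://example.com/upload', array(
	'headers' => array(
		'Content-Type' => 'multipart/form-data; boundary=' . $boundary,
	),
	'body'    => $body,
) );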

It works perfectly fine; however, since the whole file is read into memory, it might run into issues with bigger files.

MAMP or the Easyengine Vagrant

Well, to be honest, I've never even tried using MAMP because I never felt the need for it, so I can't give you the pros and cons of MAMP.

I've recently switched to Vagrant with EasyEngine, a good alternative to a manual WordPress setup, and things are so easy for site development on a Mac that I never felt the need for MAMP. However, if you'd like to read about the pros and cons of Vagrant over MAMP, you can refer to From MAMP to Vagrant.

You can easily set up Vagrant using their installation guide.

Once you are done setting up Vagrant and EasyEngine, the whole WordPress site setup takes less than a minute (on an 8 Mbps connection) or a few seconds (on a DigitalOcean server), as the only time EasyEngine really spends is on downloading WordPress; the rest of the setup is a breeze. It is really handy for WordPress developers.
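
For a sense of how little is involved, creating a WordPress site with EasyEngine is a single command (v3-era syntax; the domain is a placeholder):

ee site create example.com --wp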

EasyEngine uses the Nginx server rather than Apache, which is considered to be better in terms of performance. There are lots of site management options for a production server; you can get all the details on their site: https://rtcamp.com/easyengine/
