Tutorial #24: Tweaking the Asus TUF Gaming A15


Previously, in my review of the laptop, the only tweak I had undertaken was an auto-overclock of the GPU. As expected, it yielded an overall performance improvement of around 6% with only a slight loss in CPU performance, purely on account of the additional thermal headroom available.


At the time I had left the CPU untouched because AMD does not officially support tweaking on laptops and Ryzen Controller did not work for me then. However, I later came across Renoir Mobile Tuning and found it to be operational for this laptop, albeit with a few bugs. I then switched back to Ryzen Controller and found that it too now works well with Renoir, with the added benefit of applying the settings automatically on boot.

With a CPU tuning tool in place, the next thing was determining what to do with it. While these tools often end up as overclocking utilities, my intention was quite the opposite: to effectively underclock the system without losing performance, i.e. to reduce temperatures while still maintaining a performance boost over the stock settings.

To cut a long story short, I played around with various combinations of settings before finally settling on one that seems to work best. It is not an exhaustive analysis, but rather the most practical of the ones I tried. Note that I only experimented with the boost TDPs and the temperature limit. The boost duration seemed sensible as-is, and I did not want to introduce yet another variable that muddied the testing. Eventually this resulted in the following changes:
  • Temperature Limit: 90 °C
  • Long Boost TDP: 54 W
  • Short Boost TDP: 50 W
For reference, the default temperature limit is 95 °C, with long and short boost TDPs of 60 W and 54 W respectively. Also, I auto-overclocked the GPU again to make the most of any benefit available from the reduction in CPU performance. So, how did this theoretical reduction in CPU performance impact the benchmark scores for Fire Strike and Time Spy compared to the ones from the review?


As expected, this had quite an impact on the CPU score, which dropped by nearly 5%; on the other hand, the graphics score jumped by 1%, resulting in an overall gain of 0.7% in Fire Strike and taking it past 16,000 for the first time. The Time Spy result was more interesting: there was a minor overall loss instead, indicating that the underclock has more of an impact on DX12 than on DX11, which is probably not unexpected. Note that these figures are relative to the gain already achieved by overclocking the GPU in the review, so the incremental gain is still worth it.
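Since 3DMark derives the overall score from a weighted harmonic mean of the sub-scores, a small graphics gain can outweigh a larger CPU drop. A minimal sketch of the arithmetic is below; the weights and scores are illustrative assumptions on my part, not 3DMark's published values.

```shell
# Weighted harmonic mean of a graphics and a physics/CPU score.
# The weights (wg, wp) and the sample scores are assumptions for illustration.
overall() {
  awk -v g="$1" -v p="$2" 'BEGIN {
    wg = 0.85; wp = 0.15
    printf "%.0f\n", (wg + wp) / (wg / g + wp / p)
  }'
}
overall 18000 20000   # graphics-heavy weighting keeps the overall close to the graphics score
```

With such a weighting, a 1% graphics gain moves the overall score far more than a 5% physics drop pulls it down, which matches the Fire Strike behaviour above.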

Lastly, the laptop has a secret weapon up its sleeve. Until now, all the tests were conducted using the default Performance mode. However, there is also a Turbo mode which sets the fans whirring at what is possibly their maximum speed under full load, and yes, it boosts the scores even further. Below I have again attached a comparison of Turbo mode performance with the stock CPU settings against the underclocked ones, and the story is much the same as earlier: DX11 performance is higher with the underclock, while DX12 performance is lower by a similar proportion.

Keep in mind, though, that beyond the scores, the underclock has the additional benefit of reducing overall temperatures and thereby prolonging the life of the components. Also, in the pairing of the 4800H with the RTX 2060, it is the latter that hits its limit far more easily than the former, so sacrificing some CPU performance for a GPU gain makes a lot of sense.

Finally, I leave you with a comparison of the current profile, comprising a GPU overclock and CPU underclock on Turbo, against the stock GPU and CPU settings.

A jump of 7.8% on DX11 and 6.6% on DX12, with lower overall temperatures to boot, is nothing to scoff at. It seems the proverbial free lunch does exist after all.

Tutorial #23: Taming the beast (Dell XPS 7590 Core i9)


One of the significant purchases I made over the past 6 months is the Dell XPS 7590 with Intel's Coffee Lake Core i9-9980HK, Samsung's 32 GB of DDR4-2667 RAM, Toshiba's 1 TB SSD, Nvidia's GTX 1650, and the crème de la crème that is the 4K OLED panel made by Samsung. But before you get any ideas, this is not a device I would otherwise have purchased, but I found a single piece listed on Amazon two days prior to the device's official launch at a price lower than that of the 2019 Acer Helios 300. The risk was worth it, as it came sealed with valid 12-month Premium Plus support from Dell. There are instances in life where one doesn't mind getting lucky, and this was certainly one of those.

Normally, I would be prompt in reviewing devices within the first few weeks of purchase. However, in this case I think I am too biased towards the device to put up a worthwhile review. Hence, I thought it better to post a tutorial that would be of some assistance to fellow users. One thing I am certain of is that the hardware has outgrown the XPS chassis design over the years, and the Core i9 pushes things a bit too far in terms of what the chassis can handle thermally. Hence, I went on an optimisation quest with the intention of lowering the temperatures and increasing the overall efficiency of the device. I will own up to the fact that I don't intend to use the device on battery unless forced to, but for that eventuality I decided on a compromise: stock performance at lower power consumption on battery, as against higher performance when running on AC.

The tool of choice for the CPU in this case is ThrottleStop, which offers significantly more tweaking potential than Intel's Extreme Tuning Utility. As for the GPU, the mainstream tool is MSI Afterburner. However, with this GPU, I found the temperature limit setting in MSI AB to be locked for some reason even after unlocking all the advanced options, and the auto-overclocker resulted in far too frequent game crashes. Hence, I instead went with Asus GPU Tweak II, which allows the GPU temperature target to be set upfront. By default this is set to 75 °C; I bumped it to the GTX 1650's stock value of 87 °C. The idea in general is still not to exceed 75 °C during the most strenuous tasks, but to provide the headroom to exceed it if needed.

With this background, and in the interest of time, I have decided to simply post screenshots of the various screens from the tools, since further elaboration on each parameter can be found on their respective forums. In case of the GPU, I eventually stuck with simply pushing up the clocks by 10%, as undervolting via the frequency curve resulted in far too many instability issues. Is this the most optimal setting possible? Most probably not. However, I believe it is the best setting I could identify through trial and error, as attested by the 88 unexpected reboots on record. I could certainly push the clocks and voltages quite a bit more, but in general that led to instability, and I am certainly no fan of BSODs. Another point to note is that while Asus GPU Tweak II can be set to start on reboot, ThrottleStop requires additional effort in setting up the Task Scheduler, which is what I have indicated below.

Starting ThrottleStop on login:
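For those who prefer setting this up from the command line rather than the Task Scheduler UI, a roughly equivalent one-liner from an elevated prompt is sketched below; the install path is a placeholder, and /RL HIGHEST makes the task run with the highest privileges so that ThrottleStop gets the rights it needs.

```
schtasks /Create /TN "ThrottleStop" /TR "C:\Tools\ThrottleStop\ThrottleStop.exe" /SC ONLOGON /RL HIGHEST
```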

ThrottleStop settings for the AC profile:

ThrottleStop settings for the Battery profile:

Now to focus on the fruits of the labour, or the proof of the pudding, so to say. I am not a fan of benchmarks in general, but in this case I needed something to comparatively measure the impact of the changes, and a few basic benchmarks provide the easiest reference. Note that I ran all the benchmarks with only the discrete GPU enabled and the overclock settings applied, so they represent the worst possible scenario in terms of thermals.

UserBenchmark:
This might not be the first benchmark utility that springs to mind, but it allows comparative analysis of similar hardware components and is of considerably short duration. In this case, the CPU came up at the 97th percentile and the GPU at the 100th percentile which, considering that it is mostly going up against much bulkier gaming laptops with much better thermals, is noteworthy. Overall, the CPU efficiency is excellent, with the tweaks providing higher performance at lower power. The discrete GPU, however, doesn't scale up in terms of efficiency: while it is possible to get more performance out of it, it comes at a significant cost in terms of power and heat.

Cinebench:
Cinebench really pushes the CPU and is thus a good test of its ultimate performance. A sequence of two consecutive runs also pushes the CPU to its thermal limits. Not surprisingly then, the first-run score of 3684 is more than 20% better than stock, and even the second consecutive run scores better than the stock settings with lower average temperatures.

Heaven:
This benchmark was run at the Extreme preset. As mentioned previously, pushing the GPU doesn't really yield huge benefits in this constrained form factor, as any performance benefit comes with equally higher power consumption and heat generation. However, as can be seen in the results, a 3% performance boost in Heaven comes with lower CPU temperatures, and the GPU power consumption is lower even though it hasn't been undervolted. So, a win-win overall.

Lastly, how do these modifications fare with a modern game? I happen to have Hitman 2 installed at present, so I thought I'd give it a go with the built-in benchmarks, which I frankly didn't find to be entirely consistent across different runs. But they should at least give an idea of what the laptop is now capable of, even though it is not meant to be a gaming laptop.
I set all the details to the maximum possible, apart from lowering 'Level of Detail', 'SSAO' and 'Shadow Quality' a notch to 'High' and turning 'Motion Blur' off. The Mumbai benchmark produced a score of 70.95 FPS with the CPU averaging 79 °C and the GPU 70 °C. The more demanding Miami benchmark churned out 54.04 FPS with CPU/GPU temperatures averaging 78 °C/69 °C respectively. A more than serviceable gaming machine, if I may say so.

Tutorial #22: Optimal performance from the Samsung Galaxy A50 (or any mid-range device)


The demands on hardware have risen significantly with every passing year, made only worse by manufacturer-specific UIs adding an extra layer of cruft. While hardware capabilities increase demonstrably every year, the software demands more than negate the gains and ensure that even last year's flagship is not a safe bet anymore. However, not everyone needs the latest flagship device or wants to spend a small fortune on the extra processing power.

As I touched upon previously, my primary reason for getting the A50 was the large OLED screen. With gaming on mobile out of the picture, all I really wanted was to avoid the horrible user experience that becomes part and parcel of any mid-range device over time. Mi devices are the most offending in that respect with MIUI, but Samsung hasn't won itself any honours either by bundling lots of promoted apps, some of which cannot be uninstalled, coupled with a Samsung Pay Mini card that interferes with the gesture system.

While adb commands offer a fair degree of control over the device, I prefer to root the device when possible to be able to customise it just that bit better with less hassle. The Samsung A50 has the perhaps unintended benefit of being able to boot into either the rooted or the unrooted system at any point of time, which ensures the best of both worlds if you are not looking to use rooted apps all the time.

With this, I present a step-by-step guide to setting up the device to run like it does when brand new, only better, thanks to the uninstallation of all the bloatware. While it won't make any of the games run faster than the hardware allows, it does ensure that the phone is running optimally at any point of time, with no memory hogs, sudden slowdowns or battery drains.

1. Rooting the device (optional)

First, the disclaimer. Rooting the A50 trips the Knox bit, so you immediately forego the device warranty as well as the ability to use any Knox-secured apps like Secure Folder and Samsung Pass, though you can still run some Samsung apps like Pay Mini and Health.

For this, I will simply point you to John Wu's excellent tutorial. It has worked with every firmware released to date and allows you to upgrade to every new release while retaining your data, the downside being that you have to download the entire firmware to do so, as OTAs are a no-go.

Also, as I mentioned previously, the peculiar partitioning and button combinations allow one to boot into either the rooted or the unrooted system. I personally prefer optimising the system in root mode but don't run it as a daily driver, as it has issues with WiFi disconnections and random reboots. However, the changes carry over just fine to the unrooted system, which is rock stable and has not rebooted randomly on me to date.

2. Installing Island and making it device admin

Island makes use of the 'Android for Work' feature to create a separate work profile for which it, and consequently you, are the admin. It can be made the device admin without root access, provided you delete all other user accounts and make it the admin using adb commands. There is also the option of God Mode, which essentially allows Island to control the Mainland apps as well.
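For reference, the adb route mentioned above looks roughly like the following; the admin receiver component is deliberately left as a placeholder, since the exact name should be taken from Island's own documentation.

```
adb shell pm list users                # the device-owner command fails if secondary users exist
adb shell pm remove-user 10            # remove secondary users by their numeric id (10 is an example)
adb shell dpm set-device-owner com.oasisfeng.island/<admin-receiver>
```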

3. Installing Greenify

Island by itself doesn't have a background service; it utilises Greenify, unsurprisingly from the same developer, for that purpose. While Greenify can normally hibernate apps using Android Doze, the integration with Island takes it to the next level.

4. Deep Hibernation
The easiest way to ensure that apps undergo deep hibernation is to select the 'Auto-freeze with Greenify' option from within Island. This directly adds the app to the "Hibernation list" in Greenify with the 'Deep Hibernation (by Island)' option enabled. Alternatively, one can manually add the app within Greenify and then select the same hibernation option.

5. Create 'Unfreeze' shortcut
Subsequent to selecting an app for Deep Hibernation within Island as mentioned in the previous step, it is a good idea to immediately select the 'Create Unfreeze & Launch Shortcut' option which does what it says. It allows you to directly launch the hibernated app but requires you to maintain the shortcut on the homescreen, iOS-style.

6. Create Hibernation Shortcut
Lastly, I would suggest selecting the 'Create Hibernation Shortcut' from the Greenify menu. This places a "Hibernate" shortcut on the home screen, selecting which immediately freezes all the apps for which 'Deep Hibernation (by Island)' has been selected while also queuing up for normal hibernation any other apps you might have selected within Greenify.

7. Profit
The screenshot above shows my app drawer post-hibernation and, as you can see, the "all-time" enabled apps don't even fill a single drawer page (the folders only contain about 4 apps each). At the end of the day, you really don't need Maps or Uber running in the background all the time, tracking your location and draining the battery. Another illustration is the immediate memory impact, which in this example goes from 951 MB free to 1.2 GB free just by hibernating the currently running apps. Interface fluidity and memory consumption are certainly much better with only a limited number of apps running at any point of time.
The other benefit is that you can run dual instances of nearly every app, independently within Mainland and Island. A tip here: create a separate folder (or tab) within your launcher to hold the Island apps that you wouldn't like to be frozen, like the duplicates of Play Store or VPN apps. It simply makes the launcher look cleaner and perhaps helps prevent confusion in case the padlock symbol against the Island app icons doesn't work for you.

8. Loss

The only downside I have seen is that apps don't come up for update on the Play Store unless they are enabled, so be sure to check the Play Store for updates every now and then. Also, as I mentioned previously, hibernated apps disappear from the launcher altogether and don't reappear within the folder you might have assigned them to, as the launcher effectively sees them as new apps on every "unhibernation", though the app data is retained. Hence the recommendation to create the unfreeze shortcuts on the home screen.

9. Conclusion

There can be some paranoia over having an app become the device admin, especially one coming from China. However, I have previously interacted with the app's developer over email and found him to be polite and prompt in addressing the issues I reported.

If you simply want the benefits of an independent work profile, you can use the Test DPC app which allows you full control over the work profile as an admin. You can also use the open-source equivalent of Island known as Shelter.

However, neither of these apps integrates with Greenify like Island does, and neither is able to create a work profile once the Knox bit is tripped. Hence, in my case, Island is the only feasible option to keep rogue Android apps in check. In case you feel differently or have any queries, feel free to drop a comment below and I shall do my best to address them.

Tutorial #20: Home (Network) Improvement using a Pi


Update #1 (Aug 19, 2018):

With the passage of time, things change and in this case it is mostly for the better. I was setting up the Pi once again and was pleasantly surprised that some of the workarounds mentioned previously are no longer needed. Thus, I have edited the original tutorial to accommodate these changes while striking out the old text.

Also, after comparison, I found AB-Solution to be a better option for network-wide ad blocking, if you happen to have a Merlin-supported Asus router like I do. Since the router is on 24x7, having an old ext2-formatted pen drive plugged into the router itself and running AB-Solution is a better alternative to running the Pi 24x7. It also has various preset ad-blocking lists that suit different needs, while pixelserv-tls does a great job with HTTPS ads.

Lastly, I failed to mention the option of having a Samba server running on the Pi itself in order to access the files directly from the pen drive over the network. This can be accomplished by simply following the official tutorial on the Raspberry Pi website.

Original Article (May 3, 2018):

I had previously shared some tutorials on setting up the Pi and putting it to good use. However, the use cases I mentioned then have ceased to exist. The Fire TV has taken care of most of my multimedia needs, and I have come to realise that I really don't have much time for nostalgia. For the retro needs that remain, my lesser-used Core M 5Y10-equipped Windows tablet does a much better job plugged into the TV.

Hence, it is time to put the Pi to good use in a different sense. Thankfully, the versatility of the Pi means that it is not difficult to identify its next project. Ads can become a nuisance, especially for the older members of the family, and hence my first intent was to set up an ad blocker across my home network. However, putting the Pi to such limited use and keeping it on 24x7 would be quite a waste, so I decided to also repurpose it as a download box with centralised storage.

Setting up the tools necessary to accomplish these tasks seemed straightforward. However, the relevance of publicly available tutorials diminishes over time due to changes in technology. In fact, I had to put in quite some effort beyond the listed tutorials, and hence I have decided to put it all down in words for posterity.

Before starting out:

PINN is a great utility when multi-booting across different distributions on the same SD card. However, Raspbian alone fits the bill for the current use case. Hence, writing the raw Raspbian image directly on the card is preferable as it provides more usable space. As far as writing on the card goes, Etcher is the way to go.
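As an aside for command-line folks, the raw image can also be written with dd instead of Etcher; the device node below is a placeholder, so verify it with lsblk first, as writing to the wrong disk is destructive.

```
sudo dd if=raspbian.img of=/dev/sdX bs=4M status=progress conv=fsync
```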

1. Pi-hole®

As the website so prominently displays, all you need is a single command.

curl -sSL https://install.pi-hole.net | bash
However, I made a couple of settings that are worth mentioning:

a. The predefined list of upstream DNS providers does not yet include Cloudflare which I found to be the fastest of the lot. Hence, it would be worthwhile to use the custom option and enter the Cloudflare DNS Server IP addresses of 1.1.1.1 and 1.0.0.1.

b. The other part of the equation is setting up the home equipment to use the local DNS server. In case of Asus routers, this implies changing the DNS Server IP address to the Pi-hole one, not only on the WAN page (under WAN DNS Settings) but also under LAN > DHCP Server > DNS and WINS Server Setting. Make sure that there are no other IP addresses present in either of the pages. You could also run the DHCP server on Pi-hole, in which case the latter setting is not needed. However, since I use the router-assigned IP addresses for other functions (eg. VPN), I prefer to have the DHCP server running on the router itself.
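Once the clients pick up the new DNS server, it is easy to verify from any machine that queries are indeed being answered by Pi-hole; the IP address below is a placeholder for the Pi's LAN address.

```
nslookup flickr.com 192.168.1.10
```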

2. qBittorrent

qBittorrent has been my preferred BitTorrent client for quite some time, it being open-source and having proper support for proxies, unlike Transmission. It can be installed easily using APT, though I prefer the headless route.

sudo apt-get install qbittorrent-nox
My primary endeavour was to have the downloaded files end up on the USB 3.0 hard disk connected to my router (thus acting like a NAS) while minimising read-write operations on the SD card. Since it is not a great idea to write to the SD card running the client, I decided to plug in an old 32 GB pen drive to act as the "working folder" by adding it under Downloads > Hard Disk > 'Keep incomplete torrents in:'

The next part was to add the network drive as the final resting place by entering its address under Downloads > Hard Disk > 'Save files to location:' and also under the 'Copy .torrent files for finished downloads to' field. The latter is sometimes necessary due to some quirks in different versions of qBittorrent. The external network drive needs to be mounted within Raspbian on boot, which can be accomplished by editing /etc/fstab with these details:

Old entry (no longer working; see the note below):
//xx.xx.xx.xx/folder /media/NAS cifs rw,vers=2.0,username=abc,password=xyz

Current entry:
//xx.xx.xx.xx/folder /media/NAS cifs vers=1.0,username=abc,password=xyz,x-systemd.automount

where,
xx.xx.xx.xx -> LAN IP address as configured on the router running the Samba server
folder -> Path to the folder on the network drive that needs to be mounted
/media/NAS -> Path on the Pi where the network drive is to be mounted

In my case, I had to specifically mention the SMB version (2.0), without which the mounting would fail, as well as the rw argument to be able to write to the device. Also, it is a good idea to update the cifs-utils package from APT prior to editing the fstab file.

Note: The earlier entry no longer worked with the June 2018 version of Raspbian, due to which I had to use the alternative entry mentioned above. Also, I was only able to get v1.0 working this time, despite the server supporting v2.0 as well.
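As an aside, a variant of the same fstab entry using a credentials file keeps the username and password out of the world-readable /etc/fstab; the file path here is just an example.

```
//xx.xx.xx.xx/folder /media/NAS cifs vers=1.0,credentials=/home/pi/.smbcredentials,x-systemd.automount 0 0
```

The credentials file itself contains the username=abc and password=xyz lines and should be readable only by root (chmod 600).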

Finally, to cover for unexpected reboots, it is preferable to have qBittorrent autostart which can be accomplished using systemd. First, create the startup script using:

sudo nano /etc/systemd/system/qbittorrent.service
Next, enter its contents as follows:

[Unit]
Description=qBittorrent Daemon Service
After=network.target
[Service]
User=pi
ExecStart=/usr/bin/qbittorrent-nox
ExecStop=/usr/bin/killall -w qbittorrent-nox
[Install]
WantedBy=multi-user.target 
Lastly, enable the script.

sudo systemctl enable qbittorrent
3. pyLoad

The final part of the exercise was to set up a download manager. I had briefly given thought to JDownloader but decided against running a JRE just for it. Hence, I opted for pyLoad instead. The tutorial linked here works fine for the most part but needed quite a few tweaks along the way. For the sake of completeness, I will list all the steps in brief.

1. Create system user for pyload

sudo adduser --system pyload
2. Edit /etc/apt/sources.list to be able to install the dependencies. For Raspbian Stretch, the source URLs are as follows:

deb http://mirrordirector.raspbian.org/raspbian/ stretch main contrib non-free rpi
deb-src http://archive.raspbian.org/raspbian/ stretch main contrib non-free rpi
3. Update package list and install dependencies.

sudo apt-get update
sudo apt-get -y install git liblept4 python python-crypto python-pycurl python-imaging tesseract-ocr zip unzip python-openssl libmozjs-24-bin
sudo apt-get -y build-dep rar unrar-nonfree
sudo apt-get source -b unrar-nonfree
sudo dpkg -i unrar_*_armhf.deb
sudo rm -rf unrar-*
4. Create a symlink to get "spidermonkey" working.
cd /usr/bin
sudo ln -s js24 js
5. Unlike in the linked tutorial, I had to first switch to the stable branch before I could initiate the pyLoad setup. This is done as follows:

cd /opt
sudo git clone https://github.com/pyload/pyload.git
cd pyload
git branch -r
git checkout stable 
 6. The next step, as per the linked tutorial, should have been the initiation of the pyLoad setup using:


sudo -u pyload python pyLoadCore.py
However, doing so produced the following error: "ImportError: No module named pycurl". Hence the next logical step was to install pycurl:


sudo pip -v install pycurl --upgrade
This in turn resulted in the error: "InstallationError: Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-IsWfyN/pycurl/". This was resolved by:


sudo apt-get install libcurl4-gnutls-dev
 As you can now guess, this in turn resulted in yet another error: "Failed building wheel for pycurl" which was remedied as follows:


sudo apt-get install libgnutls28-dev

After all this effort, I was finally able to install pycurl using the command mentioned previously...


sudo pip -v install pycurl --upgrade 
...and execute the pyLoad setup using a slightly shorter command:

python pyLoadCore.py

On the June 2018 release, however, I could only get the pyLoad wizard started by executing:

sudo -u pyload python pyLoadCore.py

The final tweak was to get pyLoad running on boot with the commands executed in a manner similar to what has been already covered for qBittorrent.

sudo nano /etc/systemd/system/pyload.service 
[Unit]
Description=Python Downloader
After=network.target
[Service]
User=pyload
ExecStart=/usr/bin/python /opt/pyload/pyLoadCore.py
[Install]
WantedBy=multi-user.target
Lastly, enable the service.

sudo systemctl enable pyload.service
Beyond Home

The benefits of this setup can be extended beyond the home network, though a lot depends on the vagaries of the network setup.

Pi-hole can be put to use on external networks by accessing the home network over OpenVPN, though speed and latency might be factors to consider. It can also be set up as a public DNS resolver, but that is extremely risky and not at all recommended.

qBittorrent and pyLoad can simply be accessed using the IP address and port, provided they have been set up to be accessible from outside the LAN. For dynamic IP addresses, the Dynamic DNS (DDNS) option available on most routers can be put to use, and my suggestion would be to pick up a cheap $1 domain from Namecheap for this purpose. However, if you happen to be behind a multi-layered NAT under ISP control like me, then there is no option other than to pay for a static IP for public access.

Thankfully, there is a last resort for accessing the Pi over a public network. The licensed copy of RealVNC that comes with Raspbian offers Cloud Connect, which enables one to remotely control the OS and thereby all applications on it. It is quite cumbersome if your intent is only to load some links in qBittorrent or pyLoad, but it is better than nothing.

Thus, the Pi can be extremely useful even when used in a rather sedentary capacity and you grow to appreciate the efforts that everyone has put in to make this possible.

Tutorial #19: Optimally managing photos and videos on iOS


Apple’s anaemic storage options (specifically at the lowest tier) have been a running joke for a majority of the iPhone’s existence. I was at the receiving end of Apple’s largesse when the entry-level iPhone 7 switched to 32 GB, which was further enhanced to 64 GB with the iPhone 8. Even with these storage options, one can easily fill the device up with content other than captured photos and videos. Also, it is not a sound idea to have all your files stored locally on the device, irrespective of its storage capacity.

Apple provides a few options to mitigate the storage issues resulting from ever larger multimedia content. These are as follows:

1. The default option that most may take recourse to is the iCloud Photo Library. However, Apple provides a meagre 5 GB of storage for starters and as is typical of the company, you are expected to pay more to use this option practically. It only makes sense to go with Apple’s cloud if you live on it through other devices in the Apple ecosystem, like the Mac or iPad. The more important thing to note here is that by default Apple syncs your local photo library with the iCloud one, so you can end up permanently deleting your photos from the device as well as the cloud if you are not paying attention.

2. With the release of iOS 11, Apple introduced the high-efficiency formats, HEVC for videos and HEIF for images, that significantly reduce file sizes on modern iDevices. The downside is that compatibility for these formats is still not standard across platforms and devices. Most notably, the HEIF format is not yet natively supported by Windows. Even within Apple’s ecosystem, sharing images or videos with older devices necessitates a conversion, which takes up time as well as processing power on the mobile device.

3. Lastly, iOS also provides an ‘Optimize Storage’ option that keeps a lower quality version of the image on the phone for immediate viewing purpose while retrieving the full quality image from iCloud. This helps in dealing with storage issues but yet again results in the usage of additional time and data.

Luckily for iOS users, there are several third-party options available that allow one to back-up and retrieve photos and videos without having to pay or worry about running out of storage. After using quite a few options, I have shortlisted two well-known ones that together offer an unbeatable 1-2 combination. They are Flickr and Google Photos.

Before starting out, I would recommend that you go to Camera > Formats and select the ‘Most Compatible’ option which uses JPEG/H.264 instead of HEIF/HEVC. This ensures that the images are available for use without any conversion and accessible on all platforms. It will, of course, take up additional space but since we are offloading most of the stuff on to the cloud anyway, storage isn’t a constraint. On the other hand, data usage can be a constraint if you are limited to a cellular network, but the solution here ensures that even that eventuality is covered. As for the ‘Optimize Storage’ option, you can leave it enabled as iOS always provides the full quality image to any other app that requests it.

Our primary solution to the image storage problem is Flickr. One can argue that Flickr has seen better days and the Yahoo hacks might have left a few people disillusioned. Many photographers might have a preference for 500px as well, but that doesn’t take anything away from Flickr as far as our use case is concerned. Assuming that Oath (Verizon) doesn’t bring about any drastic storage policy changes, Flickr offers the best value proposition for free users. The 1000 GB of storage space is unprecedented, and the photography focus of the site makes it much better for image/photo management than a paid, storage-only option like OneDrive.

While Flickr has moved some of its tools, like the desktop ‘Uploadr’, under the Pro banner, the iOS app is unaffected. It is capable of syncing with the iOS Photo Library and, more importantly, uploading the original image to the cloud. It does not, however, support the HEIF format, as is evident when you try to upload such images via the website. On iOS, images in the Photo Library are still uploaded after conversion to JPEG, which is why I recommended the ‘Most Compatible’ option above to prevent unnecessary conversions. Unfortunately, Flickr doesn’t allow photos and videos to be handled separately when uploading over a cellular connection, so I would recommend syncing only over WiFi unless you have an uncapped cellular plan.

The sidekick to our protagonist Flickr is Google Photos. On its own, Google Photos is an awesome product. However, ‘original quality’ images and videos count against the storage available on Google Drive for non-Pixel users, which in most cases is 15 GB. Luckily, Google offers an unlimited ‘High Quality’ option which, one should note, significantly compresses the image. However, thanks to clever use of algorithms and machine learning, the effects are not visible on most devices unless the image is blown up significantly.

As a secondary backup solution, Google Photos offers some distinct advantages. Firstly, it caches lower quality variants of all the images so that the entire library is accessible even when you are offline. Secondly, it offers smaller-sized files on account of the compression and resolution limitations of 16 MP/1080p, which is useful when accessing or sharing something over a cellular connection on social media. Thirdly, it allows photos and videos to be synced separately over WiFi and cellular connections, so that images can be synced immediately while larger videos can be uploaded later over WiFi. Fourthly, once images are backed up, they can be easily deleted from the device (and iCloud) using the ‘Free up space’ option. However, for this, you should ensure that the original images are first uploaded to Flickr. Lastly, the machine-learning powered search is really useful in unearthing hidden images and recreating past memories.

Thus, the combination of Flickr and Google Photos ensures that all your images and videos are backed up regularly, with redundancy, and available on demand. While Flickr provides the long-term, original-quality storage, Google Photos complements it with smaller, optimized content for on-the-go consumption. It cuts iCloud out of the picture entirely and ensures that you have more storage available on your device for the things you need and use far more regularly.

Tutorial #18: Unlocking the bootloader on Redmi Note 3

As I had mentioned in my review of the Redmi Note 3, it was good value for money. However, MIUI proved to be a hindrance for the target user, because of which I had switched the device to LineageOS while keeping it unrooted and the bootloader locked. With the phone now back in my hands, it was time to break the shackles for good.

To Xiaomi's credit, they have an official process in place for unlocking the bootloader. However, it has its quirks, and more often than not, following the official guide leaves the process stuck at 50% due to incompatibilities. This was the case with my first attempt, and hence I decided to proceed unofficially.

As always, I headed to XDA to quickly gather the process for this device. However, the process there seemed a bit outdated and perhaps difficult to follow for the uninitiated, so I have listed the steps I undertook. There may be other ROM versions or files you can use, but I have mainly picked the ones from the XDA thread linked above, which you can visit in case you need a visual reference.

Note: Prior to starting with the flashing process or even after flashing the MIUI ROM in Step 4, make sure you have the 'OEM unlocking' option selected under Developer Options, without which the fastboot unlocking will fail.

Step 1: Download and extract/install the following:
1. MIUI Global ROM v7.2.5.0 (You will have to extract the file twice to get the folder contents)
2. Mi Flash Tool (Official MI tool to flash ROMs - used v2017.7.20.0 at the time of writing)
3. Unlocked emmc_appsboot file (Primary file needed to unlock bootloader)
4. EDL Fastboot (To enter emergency download mode)
5. Minimal ADB and Fastboot (Installs necessary ADB and fastboot drivers)
6. TWRP Recovery (Gateway to flashing anything on the device)

Step 2:
Browse to the 'images' folder within the extracted ROM folder (kenzo_global_images_V7.2.5.0.LHOMIDA_20160129.0000.14_5.1_global) and replace the emmc_appsboot.mbn file in that folder with the downloaded one.


Step 3:
After installing the Minimal ADB and Fastboot drivers, connect the phone and run 'edl.cmd' from within the extracted 'fastboot_edl' folder to boot the phone into emergency download (EDL) mode. If you don't, the Mi Flash tool may give a 'tz error'. This mode can be recognised by the flashing red LED on the device.


Step 4:
Run the Mi Flash tool using 'XiaoMiFlash.exe' and select the folder containing the extracted ROM files (kenzo_global_images_V7.2.5.0.LHOMIDA_20160129.0000.14_5.1_global). Clicking on 'Refresh' should list the device; subsequently, click on 'Flash'. The process will take 4-5 minutes to complete, after which you will see the 'success' status.

Step 5:
Boot the phone to the normal fastboot mode using the Volume Down + Power button. Open a command prompt window and browse to the 'Minimal ADB and Fastboot' folder. Here, execute the following command:
fastboot oem device-info
It should indicate 'Device unlocked: false', following which execute the command:
fastboot oem unlock-go
Running the 'fastboot oem device-info' command once again should now indicate 'Device unlocked: true'. That's it, your device now has an unlocked bootloader.

Step 6:
While not part of the bootloader unlocking process, a follow-up step should be to flash the TWRP recovery, which opens up a whole world of opportunities. This can be done by copying 'twrp-3.1.1-0-kenzo.img' (the latest file at the time of writing) to the 'Minimal ADB and Fastboot' folder and running the following command from the cmd window:
fastboot flash recovery twrp-3.1.1-0-kenzo.img
Following this, you will have complete freedom to tinker with the device in any way you deem apt. Oreo, anyone?

Tutorial #17: A batter(!) understanding of dosa economics


Raghuram Rajan has been featuring a lot in the media recently in promotion of his book 'I Do What I Do'. While I am yet to read it, there has been no escaping it as select excerpts and anecdotes have been making their way to the news every day now. Earlier today it was the turn of 'Dosa Economics' on BBC.

On the face of it, it is a simple concept: understanding the real interest rate as against the nominal one. Most people tend to look at the interest rate in absolute terms since it is the most visible figure, while inflation, the silent killer, is rarely understood. It was a noble attempt by Rajan at explaining this concept, though how many pensioners received the message, even after the simplification of numbers, is debatable.

However, I see no reason for Raghuram Rajan to have a monopoly on dosas in economics. Moreover, one would be hard-pressed to find a dosa for ₹50 in a city like Mumbai, let alone 1-year fixed deposits at 8% and real-life consumer inflation at 5.5%. So, now you get to create your own realistic dosa economics, provided you have the appetite for it.
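For those who would rather see the arithmetic than hear the anecdote, here is a minimal sketch of dosa economics in Python, using the illustrative numbers quoted above (a ₹50 dosa, an 8% fixed deposit, 5.5% inflation); the ₹100,000 deposit amount is my own assumption for the example:

```python
# Dosa economics: real vs nominal returns.
# Assumed figures for illustration: Rs. 50 dosa, 8% fixed deposit,
# 5.5% inflation, and a Rs. 100,000 deposit. Plug in your own.

def dosas_after_one_year(deposit, dosa_price, deposit_rate, inflation):
    """Dosas affordable today vs. after a year in a fixed deposit."""
    today = deposit / dosa_price
    later = (deposit * (1 + deposit_rate)) / (dosa_price * (1 + inflation))
    return today, later

today, later = dosas_after_one_year(100_000, 50, 0.08, 0.055)
print(f"Dosas today: {today:.0f}, dosas after a year: {later:.0f}")
# -> Dosas today: 2000, dosas after a year: 2047

# The real interest rate is what actually buys you extra dosas:
real_rate = (1 + 0.08) / (1 + 0.055) - 1   # ~2.4%, not the headline 8%
```

The point of the exercise: the deposit still buys more dosas after a year, but only about 2.4% more, because inflation quietly eats most of the nominal 8%.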


Tutorial #16: The Securities Trade Lifecycle


Trading is one of the basic tenets of investment banking. Yet, detailed information on the trade lifecycle is not easily forthcoming on the web. There are, of course, articles describing it at a high level, though few seem to agree on the exact terminology and sequence. Books, on the other hand, even the eponymous ones, digress into trade strategies rather than the lifecycle itself.

The key, then, is in garnering the details of the process. Having gone through a number of sources, I can recommend the tutorial available on Udemy. It covers the lifecycle starting from trade execution, though some would prefer to start from sale or trade initiation. What's important is that it covers all the steps in sequence and builds up the details chronologically with illustrative examples. This facilitates a far better understanding than words alone would.

It is certainly not free, but worth its price, or indeed a trial. The tutorial spans 22 videos (with accompanying PDF files) and may take up to 4 hours of your time if you run it at 1x speed; I found it convenient to follow even at 1.6x, so your mileage may vary. While this may sound more like a review than a tutorial, especially compared to my other ones, I found it best to classify it as one since it is aimed at promoting learning more than anything else.

Tutorial #15: How to get official licenses (Windows/Office) for cheap


There was a time when piracy was considered to be a necessity. The unavailability of the software locally along with dollar pricing made it impossible for anyone to even contemplate purchasing the software. However, things have changed a lot since then. Local availability along with local pricing has made these products far more accessible.

But, and it is a big but, there is affordability. The pricing is certainly competitive from a commercial perspective, but personal users would still find it prohibitive, especially when the usage is limited to writing personal documents and filing income tax returns. Piracy can't be condoned, so what other valid options are available?

By valid, I refer to the ability to download and register the software using official sources. I remember getting a Windows 8 license for less than $10 during its launch thanks to a Microsoft promotion, and I wish they were as generous with their pricing in developing countries in general. However, I presume Internet anonymity has made it difficult to separate the wheat from the chaff.

Hence, it becomes necessary to take recourse to other options. One of the most prominent is the Microsoft Software Swap Marketplace on Reddit, though other forums are available. The prices are certainly competitive compared to retail, but still on the expensive side for those not dealing in USD as their local currency. Hence, the best option is to head to good old ebay.com. The price fluctuates from seller to seller and availability is entirely dependent on timing. However, if you are in luck, prices range in the low single digits of USD. Local credit cards might not work directly with ebay.com, but PayPal comes to the rescue. Do keep in mind to use your bank's currency conversion, since its fees are usually much lower than PayPal's, the premium mostly being less than 5% depending on the size of the transaction. The sellers' proclamation is that these codes have been salvaged from scrapped machines and hence are legitimate to resell. What I can confirm is that the codes work fine with office.com and are instantly redeemed, along with the download links, for any Microsoft account. Similarly, Windows activates just fine with the supplied key if used with a fresh installation.

If you prefer Office 365 instead, there are Educational subscriptions available that offer multiple years of access for about the same price. This one certainly feels a bit dodgier because you are restricted to an academic email address governed by administrators. It does offer multiple installations and 1 TB of OneDrive space, though it is difficult to trust an address you are not entirely in control of.

Whatever the case, there certainly are legitimate options available that, if nothing else, protect you from untrustworthy, malware-laden alternatives, at a significantly more affordable price.

Tutorial #14: Tips on a fresh installation of Windows Creators Update


For all the cruft that had built up since the Anniversary Update, I decided to do a fresh installation of the Creators Update, released earlier this month, on my tablet (Dell Venue 11 Pro 7140). Re-installations are much easier now than a decade ago, with cloud backups eliminating the worry of losing data. However, it still takes some effort to reduce installation time and to ensure that Microsoft's data acquisition and bloatware installation are limited. Following are my tips, learnt from experience.

1. Direct ISO downloads are a thing of the past as far as Microsoft is concerned, and it wants you to rely on the Media Creation Tool instead. That's fine for the most part, but the tool happens to download a device-specific ISO. So, if you have multiple Windows devices, especially with different editions of Windows 10, it is best to download the international ISO.

2. USB installations are undoubtedly faster than disc-based ones. However, SSDs are much faster than standard USB flash drives. In my case, I have converted a discarded 64GB M.2 SSD into a USB 3 flash drive, which reduces the initial installation process to just 5 minutes.

3. Rufus is by far the best tool to write the ISO to the USB drive. In order to write to SSD flash drives, make sure you enable the 'List USB Hard Drives' option.

4. After installation, when you boot to the profile setup screen, I would recommend not connecting to the Internet. This causes the PC to reboot into the offline setup mode, so you don't need to link your Microsoft (Outlook) account. It is said that an offline account limits the telemetry sent to Microsoft, though I can't vouch for that personally.

5. For additional privacy, you can disable all the privacy options presented on the setup screen. If you need any of them, you can always enable them later.

6. Make sure you keep the drivers from your manufacturer handy before the installation. You can copy them to the USB drive that you use for the Windows setup. In my case, Dell provides a single CAB file containing all the drivers and I usually place the extracted CAB file on my USB SSD drive for easy access. This enables the manual installation of the correct drivers using the Device Manager.

7. A very important step is to disable the installation of hardware drivers via Windows Update in case you already have all the manufacturer drivers. In my case, I found that the drivers from Microsoft for my device caused a lot of issues, especially with the display and battery management. Hardware driver installation can be disabled from Advanced System Settings > Hardware > Device Installation Settings.

8. Microsoft also tends to install a lot of sponsored apps, like Candy Crush Saga, on the device as soon as you connect to the web. Hence, it is a great idea to open the Start Menu and remove the icons for all the apps that are awaiting download. Note that you can only do this if you didn't connect to the Internet during the setup process.

9. Whenever you log in to Microsoft apps like Mail or OneDrive, make sure that you sign in only to the app and do not associate it with the Windows account. This again ensures better privacy and account management.

This just about covers the most important things to keep in mind when undertaking a fresh installation of Windows. Doing so maximises privacy and minimises the conflicts you may encounter, thereby streamlining the installation process.



Tutorial #13: Installing (sideloading) Kodi and other open source apps on iOS without jailbreak


Apple is renowned for its walled garden, which has its upside in terms of security but prevents users from accessing apps that are not on the App Store. Thankfully, Apple opened a small gate in the fence a couple of years ago, allowing open source applications to be signed and installed by the user for personal use. This opened the door to sideloading apps like Kodi and emulators like nds4ios that would otherwise never make it to the App Store.

The long way to do this is to download the source code for the app, compile it in Xcode and then transfer the app to the phone. The catch here is that you need a Mac, or at the least a Hackintosh, besides some patience. However, this is also the only way to run all available open source applications, a handy list of which is available at this link. The official guide to compiling Kodi can be accessed here.

A much easier option is to use Cydia Impactor. It is much less time consuming and easier to execute, as illustrated in this guide. Again, the rider here is that you need the already compiled IPA files to sign them with your own certificates. Many of the prominent ones including Kodi and most emulators are available here.

With Apple, there is always "one more thing", and it is true here as well. The free developer certificates last only a week, which means the sideloading process has to be repeated every week. It is definitely an inconvenience, but by design, as it reduces the scope for piracy and bypassing of the App Store. The pain can be alleviated if you go for a paid developer account ($99/year), which allows the app to remain installed until the certificate expires at the end of a year. Lastly, if you fancy coughing up $10/year, you can go with the BuildStore, which allows you to directly install select apps from the browser, with certificates lasting nearly a year.

Of course, apart from accessing Kodi and emulators, this option also allows one to create and test one's own apps without coughing up anything, which is incentive enough for me to have a go at creating an iOS app for the website as well.

Tutorial #12: Converting calendar (CY) dates to fiscal year (FY) and quarter


I had a recent request from a colleague who wanted to arrange a bunch of dates into fiscal quarters. So, I went about creating an Excel workbook that implements this, with the freedom to select the starting month of the fiscal year, and decided to post it here along the lines of my previous tutorial. The following steps explain the logic behind my implementation.

Note that the steps below refer to the formula in cell C5. Hence B5 refers to the input date whereas the fiscal start month is captured in the cell C2.

Step 1: Input for FY Start Month

The 'FY Start Month' is the only input to this sheet and enables adaptation to any fiscal year. To prevent errors, I used Data Validation to limit the inputs to whole numbers ranging from 1 to 12.


Step 2: Calculating Fiscal Year

To calculate the fiscal year, I used simple logic: if the month of the date is equal to or greater than the fiscal starting month, then the fiscal year is incremented by one compared to the calendar year; otherwise, it remains the same.
IF(MONTH(B5)-$C$2>=0,YEAR(B5)+1,YEAR(B5))
This works for all scenarios except when the fiscal year starts in January and hence coincides with the calendar year; for that case we need an exception so that the year is not incremented. This is done with the help of an additional IF statement.
IF(MONTH(B5)-$C$2>=0,IF($C$2=1,YEAR(B5),YEAR(B5)+1),YEAR(B5))
Step 3: Calculating Fiscal Quarter

To identify the quarter, I decided to go with the CHOOSE function which makes it imperative that the calendar months are rearranged to fiscal months.

The difference in the numerical value between the calendar month and the fiscal start month can range from -11 (1 minus 12) to +11 (12 minus 1). Hence, the logic below offsets the value so that it lies between 1 and 12: 13 is added whenever the difference is negative and 1 is added whenever the difference is zero or positive.
CHOOSE(IF(MONTH(B5)-$C$2<0,13+MONTH(B5)-$C$2,1+MONTH(B5)-$C$2),"Q1","Q1","Q1","Q2","Q2","Q2","Q3","Q3","Q3","Q4","Q4","Q4")
Step 4: Combination of fiscal quarter and year

The final step is to join the two formulae, with the fiscal quarter leading the fiscal year, separated by a space.
CHOOSE(IF(MONTH(B5)-$C$2<0,13+MONTH(B5)-$C$2,1+MONTH(B5)-$C$2),"Q1","Q1","Q1","Q2","Q2","Q2","Q3","Q3","Q3","Q4","Q4","Q4")
&" "&
IF(MONTH(B5)-$C$2>=0,IF($C$2=1,YEAR(B5),YEAR(B5)+1),YEAR(B5))
Since Google Sheets supports the same semantics as Microsoft Excel in this case, you can access the same using this link.
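For those who prefer code to spreadsheets, the same logic can be sketched in Python. The function name and date-based interface are my own invention; the year and quarter calculations mirror the Excel formulae above:

```python
from datetime import date

def fiscal_quarter_and_year(d, fy_start_month):
    """Return e.g. 'Q2 2018'; mirrors the Excel formula logic.

    fy_start_month is the month (1-12) in which the fiscal year begins;
    the fiscal year is labelled by the calendar year in which it ends.
    """
    # Offset the calendar month so that fy_start_month maps to 1:
    # add 13 when the difference is negative, 1 when zero or positive.
    diff = d.month - fy_start_month
    fiscal_month = diff + 13 if diff < 0 else diff + 1
    quarter = (fiscal_month - 1) // 3 + 1   # stands in for the CHOOSE lookup
    # Increment the year once the fiscal start month is reached, except
    # when the fiscal year starts in January and equals the calendar year.
    year = d.year + 1 if diff >= 0 and fy_start_month != 1 else d.year
    return f"Q{quarter} {year}"

print(fiscal_quarter_and_year(date(2017, 9, 15), 4))   # prints 'Q2 2018'
```

With an April start (the Indian fiscal year), September 2017 falls in Q2 of the year ending March 2018, hence 'Q2 2018', matching what the combined Excel formula produces.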

Tutorial #11: Thermal control of Core M 5Y10 (Broadwell)

The first-generation Core M was an engineering marvel in terms of the power it managed to fit within the 4.5 W TDP envelope. In terms of performance, it sat somewhere in between the 5th-generation U-series Core i3 and i5, which bear a TDP of 15 W, but blew both of them out of the water when it came to efficiency. The fanless design not only cut down the weight of the tablet but removed a significant point of failure. My gut feeling is that whirring fans are not a good fit for use cases that involve significant movement, à la tablets. On the flip side, the processor package is nothing short of a toaster, with sky-high temperatures under sustained full load.

This led me to fire up Intel's Extreme Tuning Utility (XTU) on my Dell Venue 11 Pro 7140. Devices in a portable form factor like tablets and laptops have very constrained thermal dissipation and hence are not good tuning candidates; unsurprisingly, only the voltage and turbo power controls are on offer in the case of the Core M. After trying out over two dozen profiles, I settled on the three detailed in the table below, wherein I have listed the changes made to the default (reference) settings. Profile 1 aims at temperatures in the mid-60s under turbo boost, which is akin to what you may find with fan-cooled processors. Profile 2 has turbo boost set to the TDP and allows for sustained usage without thermal throttling. Profile 3, on the other hand, is the default profile with stable undervolting that reduces the temperature just enough to limit instances of thermal throttling under sustained full load.


To check the impact of these profiles, I have used the benchmark within XTU (XMarks) as well as the CPU stress test (duration of 1 min). Additionally, I have used the CPU Mark and 3D Mark tests of PassMark as they seemed to be particularly responsive to the changes. Lastly, any CPU test would be incomplete without CPU-Z and hence its bench also makes an appearance.


As can be seen from the table above, there is a compromise to be made between temperature and performance, depending on what floats your boat. In my case, I decided to go with 'Profile 3' for now, since sacrificing power in a mobile device is always a tough choice. Even then, it is an improvement over the default profile in terms of performance as well as temperature. Profile 2 seems like an especially good option in case thermal throttling is a major concern, while Profile 1 plays it really cool if you can't warm up to the idea of using the tablet as a finger heater. Overall, I am to this day impressed by the Core M package, enough to have it don a triple avatar.

Tutorial #10: OpenVPN on a VPS with Ubuntu 16.04

As you may have read in my previous post, I was more than a tad disappointed by the performance of PIA. The inconsistent speeds along with geo-blocking made it really difficult to stream HD content. For me, it became a case of "if you want it done right, do it yourself", and hence I went about setting up my own OpenVPN server. As always, I am not reinventing the wheel here by rewriting what is already well written, but recounting my experience of it. This should be of some assistance to anyone looking to get their hands dirty as well.

Step 1: Get a virtual private server (VPS)

As with real estate, selecting a VPS can turn out to be all about "location, location, location". One reason is to ensure a low ping, which is a necessity for audio and video services as well as for gaming. The other is the identity that you want to assume "virtually": with geo-blocking, you might want to be based out of a specific location that offers advantages in terms of the services you can access in that country.

Having decided on the server location, the next factor is cost. If running a VPN server and some other personal services is the priority, you can get away with a single virtual core and half a gig of RAM. The available storage will vary based on whether it is plain hard disk space, SSD-cached, or pure SSD. I wouldn't recommend hosting anything substantial off such a server, and hence tens of gigs of storage space is usually more than sufficient. Pricing for such a configuration is unlikely to lighten your wallet by more than $1/month, so it might be a case of money well spent.

Once you have the server in place, most service providers offer one-click OS installations. Debian is undoubtedly a popular choice, but I found its repository to be outdated compared to Ubuntu's. Since we are not running a really professional operation here, we can make do with the stability of Ubuntu while benefiting from access to newer software packages through apt.

Step 2 (optional): Get a domain

This step is purely optional, as such a setup is unlikely to be broadcast to the world and you can make do with just the IP address. Having a domain, then, is just a matter of convenience. For me, Namecheap's $0.88 offer was too good to pass up. An alternative would be to use DDNS services, in which case you might have to renew the address periodically.

Step 3: Setup the OpenVPN server

I usually end up perusing the tutorials put up by DigitalOcean for setting up various aspects of a VPS and this time it was no different. Hence, I will just share the link to the tutorial.

However, there are a couple of things that have to be kept in mind:
1. Although the tutorial is written for a non-root user, there are some instances where the "sudo" command is omitted and you have to take care to include it, failing which the commands will simply fail.
2. Even after following the tutorial to a tee, you might encounter errors when starting the service using systemctl in Step 9. This is resolved by commenting out the line mentioned here.

In terms of performance, there is no bottleneck or geo-blocking that I have encountered to date, which is an instant improvement over PIA. Of course, having a VPN service with servers in multiple countries is still beneficial, but then a VPS allows you to do a lot more through remote shell access. The two simply complement each other and make for a good toolkit to have on your side in this age of constant data accumulation and monitoring.

Tutorial #9: The perfect (IMO) multi-boot option for the Raspberry Pi

I had previously described the myriad ways in which the Raspberry Pi can be put to use. While utilizing separate SD cards for each OS is a viable solution, it is a bit cumbersome, and perhaps expensive if you don't have a lot of spare SD cards lying around. Hence, what's better than being able to run a whole bunch of OSes from a single card and switch seamlessly between them? A point to note is that a LOT depends on the speed of the SD card; I have settled on the Samsung Evo+, which might be a bit of an overkill for the Pi but ensures the fastest experience.

In my case, I had the following requirements of a multi-boot system.
1. First and foremost, Kodi. It is a powerful media player by itself, but what makes it truly worthwhile are the countless add-ons that significantly boost its functionality. The Pi makes for a formidable media centre with its hardware H.264 decoding and finds a lot of its utility online.
2. An OS to execute the various Pi projects. A CLI may do but a GUI makes things a lot easier.
3. A retro games emulator. Who can beat nostalgia?

While there are multiple ways to fulfil each of the requirements mentioned above, my evaluation led me to settle on the following:

1. Kodi - OSMC: OpenELEC, LibreELEC and OSMC are the standalone Kodi OSes available, and while Kodi is Kodi, there are still pros and cons to each. OpenELEC and LibreELEC operate on a "just enough OS" principle, wherein the installation contains no extra fluff. This certainly speeds things up but may limit the extent to which you can play around in the background with SSH. LibreELEC is a fork of OpenELEC, and the situation is akin to what happened with OpenOffice, wherein most developers have moved to the "Libre" camp, so one might imagine it will see more proactive development going ahead. However, after trying out LibreELEC, I settled on OSMC, mainly because I was having a hell of a time with Bluetooth on LibreELEC. Also, the OSMC skin looks much more refined, and the vertical interface is much easier to use when you are left dangling with a TV remote. Moreover, while I have only used SSH to enable OpenVPN, I am pretty sure that OSMC will allow for a lot more tinkering than the "ELECs" if need be.

2. OS - Raspbian: Yup, I settled on the de facto OS. The project support for it is unparalleled, which certainly makes things a lot easier. Ubuntu MATE comes in at a close second if you are looking for a true "desktop" experience, but that comes at the expense of speed, even though it is compiled for ARMv7, as the MATE desktop environment is more resource hungry. Lubuntu is a good alternative, but you lose out on a lot of the Ubuntu look and easy customization. Having said that, the PIXEL desktop environment has made Raspbian a lot more welcoming and easy to use. The built-in license for RealVNC as well as the GUI for raspi-config are the icing on the cake.

3. Retro gaming console - Recalbox: RetroPie is the more popular option here, but in my testing, I found it to behave a bit irrationally. The configuration done in RetroPie didn't map properly to many of the emulators. Also, the Bluetooth interface kept throwing errors, wherein I couldn't remove an existing device because it wasn't listed, and neither could I add it since it was stated to be already present. In that sense, Recalbox is more user friendly. Its interface does hide more of the advanced RetroArch options, but it worked well for me out of the box and was more reliable in general.

With my options in place, the next thing was getting them to play ball on a single SD card. To that end, I considered the following options:

1. NOOBS: The option that the Raspberry Pi Foundation itself provides. It is quite easy to add offline OSes by placing them in the "os" folder on the SD card and then selecting them at the time of installation. The downside is that, by default, OS selection requires a keyboard since NOOBS doesn't support HDMI-CEC. Since I was looking for a wireless option, I had to leave this one out in its unadulterated form.

2. BerryBoot: It certainly solves the OS selection problem by supporting HDMI-CEC, allowing the use of the TV remote to select an OS. The OS images have to be modified to be compatible with BerryBoot, but thankfully there is an updated repository of all the prominent OSes maintained here. Unfortunately, there are some limitations to this approach: it uses SquashFS, a compressed file system that limits performance and also prevents one from making automated system-level updates.

3. Multi Boot Pi: This eponymous option is the first thing that pops up on Google and is indeed a great option built on NOOBS. The built-in OS switcher makes it easy to switch between OSes and also to set the default boot option. I had selected the latest available download at the time of writing, which offers quad-boot support. I presumed that, being based on NOOBS, I could avoid installing RasPlex and that doing so would adjust the menu entries in the respective OSes. However, that is not the case: RasPlex hangs around no matter what and requires manual editing to remove the references. Also, updating the various OSes led to a change in the look of the Kodi Program launcher for the other OSes, and I presume future updates might break it altogether. Even otherwise, the RetroPie option that this ships with just wouldn't work for me.

4. Matt Huisman's Dual Boot: This is built on the same principle as Multi Boot Pi, but includes Recalbox by default, which is what I prefer. Although the default installation is for dual boot, adding Raspbian was as easy as copying the folder that comes with the Raspbian NOOBS zip. It boots to OSMC by default, again as per my preference, though that can be edited easily as well. The site also includes the necessary scripts, including the one for adding Raspbian to Kodi, so any unexpected link breakages can be easily rectified. Having said that, I updated all the OSes without any adverse effects. It would also be pertinent to note that you can and should adjust the partition sizes for each OS as per your requirement. I have kept about 2 GB for OSMC, 8 GB for Recalbox and the rest for Raspbian.
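On the partition sizes: NOOBS-based setups like this one read each OS's partitions.json before installing, so one way to pre-set the allocation is to edit that file. A trimmed, illustrative fragment (the field names follow the NOOBS metadata format, but the values here are hypothetical):

```json
{
  "partitions": [
    {
      "label": "root",
      "filesystem_type": "ext4",
      "partition_size_nominal": 8192,
      "want_maximised": false
    }
  ]
}
```

Setting "want_maximised" to true for one OS lets it absorb whatever space is left over, which is how an allocation like "the rest for Raspbian" can be expressed.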

So, that's it as far as my setup is concerned. It strikes the perfect balance between all that I require of the Pi, with convenience to boot (pun intended).

Tutorial #8: Elegantly dual booting Windows and Android (Remix OS) on PC using UEFI


The ability to dual boot Android with Windows has always been an enticing prospect, and Remix OS made doing so on a desktop/laptop a much more practical one. While the Android-x86 project has been around for a long time, its support and stability have always been spotty. Moreover, stock Android, in spite of natively supporting keyboard and mouse, doesn't feel very intuitive to use on large screen devices. Remix OS, although it is based on Android-x86, feels much more suitable for this use case. It has its fair share of bugs and one is certainly entitled to question the utility of using Android on a PC, but it exists nonetheless and there is no reason not to try it out.
Out of the box, the 64-bit variant of Remix OS supports dual booting with Windows and comes with its own GRUB bootloader. But my experience with it hasn't been the best, for the following reasons:
1. On more than one occasion, the 'Windows' entry in the boot menu simply stopped working. This is more than inconvenient because it means having to boot up Windows installation media for recovery. Moreover, fixing the MBR through the command line doesn't work and neither does start-up recovery, meaning you have to reset your PC to regain access to your files.
2. Editing any boot parameter like the timeout or the default entry means playing around with the various GRUB config files in a text editor and hoping you don't mess things up.
3. Two lines on a black screen for OS selection is not the best way to relive the 90s.

In light of all these points, I found the most elegant option to be the use of a separate GRUB EFI bootloader. In case your primary OS is Windows, the easiest option is to use Grub2Win, which is easy to install/uninstall and has a GUI to boot (pun intended). On a side note, you might find references to EasyBCD over the web in relation to Remix OS, but it only works in legacy BIOS mode, which means foregoing some of the benefits of UEFI, mainly speed and security.

The steps involved in setting up the dual boot are as follows:
1. Install Remix OS as you would normally using the supplied tool from the official site (say C:\RemixOS)
2. Make a backup of the RemixOS folder which you just installed (say C:\Remix)
3. Now, uninstall Remix OS. The net effect is that your PC is now back to its pre-Remix state but you have the necessary files along with a formatted data partition of your choosing.
4. Next, install Grub2Win to the same partition where you have kept the Remix OS files (say C:\grub2)
5. Run Grub2Win and install the necessary EFI modules using the 'Manage EFI Partition Modules' option. Next, go to 'Manage Boot Menu' and select 'Edit Custom Configuration Code' for Menu Entry 1 if already present or select 'Add a New Entry'.
6. For OS Type, select 'remix' (or 'remix-nvidia' if you have an Nvidia GPU). In the 'Automatically generate Configuration Code for Remix' section, ensure that the proper path (/Remix/kernel) is selected against the 'Partition Search by File' radio button. You can play around with the other configuration options but there is no need to further edit the configuration code.
7. On the main screen of Grub2Win, if you had been booting only Windows until now, you should change the 'Windows boot timeout seconds for BCD' to "0". Again you can play around with the other timeout and theme options.
That's it, you should now have a perfectly functional, customizable and attractive boot selection screen. Moreover, you can easily set 'Windows Boot Menu' as the default boot option in UEFI so that your PC skips Grub2Win altogether unless you invoke it using the one-time boot menu (F11 in my case) and select Grub2Win EFI.
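For the curious, the custom configuration code that Grub2Win manages is plain GRUB script. A hypothetical sketch of what a Remix entry of this kind can look like (the paths match the example above, but the boot parameters are placeholders that vary by Remix version, not the exact code Grub2Win generates):

```
menuentry "Remix OS" {
    # locate the partition holding the Remix files by searching for the kernel
    search --no-floppy --set=root --file /Remix/kernel
    linux /Remix/kernel root=/dev/ram0 androidboot.hardware=remix_x86_64 SRC=/Remix
    initrd /Remix/initrd.img
}
```

The `search --file` line is what corresponds to the 'Partition Search by File' option in the Grub2Win GUI: it avoids hard-coding a partition number, so the entry keeps working even if the disk layout shifts.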