Musing #75: Analysing the Valencia Formula E debacle

 


Some may call it exciting and others farcical. For a fledgling series, Formula E certainly attracted the wrong kind of attention with the slipshod finish yesterday, which resulted in only 9 drivers being classified at the end. While it is easy to call out the FIA or Da Costa for the extra lap, the reality, as always, is more complex. This analysis thus aims to clarify the events as they unfolded.

The first big question that I came across on the web was about the large variation in the energy reduction percentage across the various SC periods. While there was only a 3% reduction during the 3-minute safety car (SC) period at the 20-minute mark, the final 5-minute SC resulted in a 12% reduction. This is quite easy to explain keeping in mind the starting usable energy of 52 kWh. Essentially, the percentage value displayed on screen is a relative value, whereas the absolute reduction happens in kWh terms, applied to both the usable energy and the available energy.

At 20:38 remaining, we can see that the available energy is 61%.

Moments later, it drops to 58% after the reduction.


The reduction itself is 3 kWh for that SC period and a total of 9 kWh for the race.


This latter part is the most important as it indicates the total usable energy after the reduction. The back-of-the-envelope calculations for this scenario are as follows. Note that these calculations are based on the whole-number figures displayed in the TV graphics; the actual numbers at full precision would be slightly different.

Usable energy before reduction: 46 kWh (52 - 6)
Available energy before reduction: 61% of 46 kWh = ~28 kWh

Usable energy after reduction: 43 kWh (52 - 9)
Available energy after reduction: 58% of 43 kWh = ~25 kWh (28 - 3)

Now let us take a look at the final SC. The available energy is 18% prior to reduction.

It drops to 6% after reduction.

The energy reduction is 19 kWh in total and 5 kWh for that SC period.

The calculations now are as follows:

Usable energy before reduction: 38 kWh (52 - 14)
Available energy before reduction: 18% of 38 kWh = ~7 kWh

Usable energy after reduction: 33 kWh (52 - 19)
Available energy after reduction: 6% of 33 kWh = ~2 kWh (7 - 5)
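To make the arithmetic concrete, here is a minimal sketch of the same back-of-the-envelope calculation in Python, treating the rounded whole numbers from the TV graphics as exact (the actual telemetry values would differ slightly):

# Back-of-the-envelope check of the on-screen figures (rounded TV graphics values).
RACE_USABLE_KWH = 52  # usable energy at the start of the race

def after_reduction(reduction_so_far, pct_shown, sc_reduction):
    """Convert the on-screen percentage to kWh, take the SC reduction off both
    the available and the usable energy, and recompute the percentage."""
    usable_before = RACE_USABLE_KWH - reduction_so_far
    available_before = usable_before * pct_shown / 100
    usable_after = usable_before - sc_reduction
    available_after = available_before - sc_reduction
    pct_after = 100 * available_after / usable_after
    return usable_before, available_before, usable_after, available_after, pct_after

# First SC: 6 kWh already deducted, 61% shown, 3 kWh deducted for this SC period
print(after_reduction(6, 61, 3))   # ~ (46, 28, 43, 25, 58%)

# Final SC: 14 kWh already deducted, 18% shown, 5 kWh deducted for this SC period
print(after_reduction(14, 18, 5))  # ~ (38, 6.8, 33, 1.8, 6%)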

As you can see above, the percentage value can be quite confusing for the viewer: both the numerator (available energy) and the denominator (usable energy) drop by the same number of kWh, so the change in the displayed percentage is far more drastic at lower energy levels than at higher ones.

That explains the TV graphics, but then why were the teams caught so unawares towards the end? For that, let us go back to the period just as the final SC came out.


At this point, Da Costa has 22% of the 38 kWh of usable energy, i.e. about 8.4 kWh. The fastest lap, as can be seen in the official notice, was about 1m 40s, which indicates that at this point there was enough time left to cover 5 laps.

However, when the SC came out at 5:38 remaining, there was still a part of the lap remaining. Luckily, Da Costa floored the car at about the same place after the SC, so we can easily make out the time needed to reach the finish line from that part of the track.


As can be seen from the images above, it takes about 25 seconds, implying that Da Costa would have crossed the line with 5:13 remaining had there been no SC. Considering the fastest lap of the race, 3 laps would have taken around 5 minutes. Thus, had Da Costa completed 3 more laps at 1:44 to 1:45 each, he would have needed to complete only 4 laps in all to finish the race with about 8 kWh in hand, which is perfectly feasible.
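As a quick sanity check of that timing claim, a minimal sketch using the rounded figures quoted above (the time to the line, race-pace lap time and the 2 kWh/lap target are all estimates):

# Green-flag feasibility check using the figures quoted above.
time_remaining_s = 5 * 60 + 38       # 5:38 on the clock when the SC was deployed
time_to_line_s = 25                  # estimated time to reach the finish line from that point
race_lap_s = 104                     # ~1m 44s per lap at race pace
energy_kwh = 0.22 * 38               # 22% of 38 kWh usable, i.e. ~8.4 kWh
target_kwh_per_lap = 2

time_left_at_line_s = time_remaining_s - time_to_line_s        # ~5:13
laps_before_time_expires = time_left_at_line_s // race_lap_s   # 3 laps in ~5:12
total_laps_to_finish = laps_before_time_expires + 1            # plus one more lap once time expires
energy_needed_kwh = total_laps_to_finish * target_kwh_per_lap

print(total_laps_to_finish, round(energy_kwh, 1), energy_needed_kwh)  # 4 laps, ~8.4 kWh on board, 8 kWh needed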

The problem then is that the SC pace was not enough to scrub off a lap, and thus the cars still had to complete 4 laps to finish the race: around 2 laps under the SC and a little more than 2 laps at race pace. While the race-pace target was 2 kWh/lap, under the SC, assuming a lap time of 2m 30s, the reduction would have been 2.5 kWh/lap. This implies that the cars lost around a kWh of energy behind the safety car due to the time elapsed, while also using close to 1.5 kWh of energy following the SC.

Thus, while under race conditions Da Costa was expected to have 4.4 kWh of energy at the point where he started the final run with 2 laps to go, in reality he had about 2 kWh. The only feasible option was to limit the race to one lap after the SC, which he was unable to do. At the same time, Mercedes seem to have had a 5-lap target in mind before the SC and were thus keeping more energy in hand compared to Da Costa, who was hoping to limit the race to 4 more laps at the time the final SC came out.
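Putting those numbers together, a small sketch of the energy position over the final SC period, assuming it ran for the full 5 minutes and covered roughly the two laps estimated above:

# Da Costa's energy position over the final 5-minute SC, using the rounded figures above.
energy_at_sc_kwh = 0.22 * 38                     # ~8.4 kWh when the SC came out

# What actually happened behind the SC
fia_reduction_kwh = 1.0 * 5                      # fixed 1 kWh/min deduction over 5 minutes
consumed_behind_sc_kwh = 0.3 * 5                 # ~0.3 kWh/min used trundling behind the SC
energy_after_sc_kwh = energy_at_sc_kwh - fia_reduction_kwh - consumed_behind_sc_kwh

# What the same two laps would have cost under green-flag running
energy_if_racing_kwh = energy_at_sc_kwh - 2 * 2  # 2 laps at the 2 kWh/lap target

print(round(energy_after_sc_kwh, 1), round(energy_if_racing_kwh, 1))  # ~1.9 vs ~4.4 kWh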

While it was a shambolic end to the race, with the FIA shifting the blame to Da Costa for not controlling the pace, the FIA is not without fault. They had never made a provision for such a scenario, and the fixed reduction of 1 kWh/min that they apply is excessive when the usable energy is low.

One way of tackling such a scenario could have been to spread the energy allocation for a race lap over an SC lap's duration (i.e. a ~2 kWh reduction over a 2m 30s SC lap). This results in a reduction rate of 0.8 kWh/minute. If this seems too low a reduction, then the rule could be changed to apply this rate only in the final 10 minutes of the race.

The other option could have been to not deduct the energy actually consumed by the car behind the SC, which seems to be about 0.3 kWh/minute (this can be calculated from the fact that the available energy reduced from 22% to 18% of the 38 kWh of usable energy behind the 5-minute SC). However, this would result in a reduction rate of 0.7 kWh/minute and is thus even more benevolent than the previous approach, which is in fact more practical as it takes into account the energy consumption of the race car and the SC for a specific track.
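For comparison, the effective reduction rates per SC minute under the current rule and the two alternatives suggested above work out as follows:

# Reduction rate per SC minute: current rule vs the two alternatives above.
race_lap_energy_kwh = 2.0    # race-pace energy target per lap
sc_lap_time_min = 2.5        # ~2m 30s per lap behind the SC
consumption_behind_sc = 0.3  # kWh/min actually used behind the SC (estimated above)

current_rule = 1.0                                                 # fixed 1 kWh/min
race_energy_over_sc_lap = race_lap_energy_kwh / sc_lap_time_min    # 0.8 kWh/min
refund_sc_consumption = current_rule - consumption_behind_sc       # 0.7 kWh/min

print(current_rule, race_energy_over_sc_lap, refund_sc_consumption)  # 1.0, 0.8, 0.7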

Will this situation be addressed? That is anybody's guess as Formula E certainly seems to be quite disorganised at present. However, the solution to the problem is available as highlighted above and all it needs is the FIA to act on it. Most probably though, I think the teams will adjust the software to account for the SC loss going forward and we might see a slow but secure finish if such a scenario arises in the future.

Musing #74: Designing Fossil Hybrid HR Watch Faces


  

Timepieces move on with time and the Fossil Hybrid HR happens to be the latest one occupying my wrist. I always have the urge to customise whatever I can get hold of, and the McWatchFace was my first attempt at a watch face.

Following that, I moved on to the Fitbit Versa 2, but I could never invest enough time to create a watch face for it. I had been yearning for a more traditional watch, but ever since I started using fitness trackers, I haven't been able to completely wean myself off them. To that end, I got hold of the Fossil Hybrid HR a fortnight ago.

Having never had a Pebble in its heyday, this watch has been a revelation for me in terms of tinkering as it doesn't require much to design something new and share it with the world at large. With that, I present my first watch faces for the Hybrid HR. If you already possess this watch, then feel free to access these watch faces using the Fossil store and let me know if you have any ideas worth implementing.

6 B&W










Musing #73: Raspberry Pi 4B - SD Card vs SSD



Earlier this year, I finally bid adieu to my Raspberry Pi 2 in favour of the Raspberry Pi 4B. The RPi 4B certainly opens up new horizons with its additional power, though I don't suppose it is a desktop replacement as some marketing material would have you believe. I couldn't care less about that aspect as it is meant to be more of a hobbyist product, though I did switch over to the MATE desktop environment from the kiddish-looking LXDE environment that Raspberry Pi OS comes with by default.

What I was looking for most was the general increase in performance, and with the BCM2836 SoC in the Pi 2 v1.1 not supporting USB boot, this was also the first time I could boot off USB. I immediately jumped on to the 64-bit beta back in May along with the EEPROM update that allowed booting off a USB drive.

I already had a SanDisk X110 M.2 SSD with me from an older tablet, along with an M.2 to USB enclosure. Unfortunately, I quickly realised that the enclosure was not up to the task, as even loading the boot files failed repeatedly. It seems the controller on a cheap enclosure isn't really that good (who would have thought?), so it meant getting another one instead. I went with an ORICO one this time, not expecting it to be great but at least better than the one I had, since it cost 3 times as much. Sure enough, it did the job.

So how much faster is the SSD compared to the SD card? Unsurprisingly, the difference is quite significant. The relatively low-powered processor on the Pi gives a better idea of the difference made by the SSD alone, though with it being limited to a shared 5 Gbps interface, the full extent wouldn't be evident if you plug in another USB 3 device or wastefully use an NVMe SSD instead of an NGFF (SATA) one.

Long story short, unless you need your Pi to occupy as little space as possible, it makes sense to boot off an SSD instead of an SD card. Also, if you have a good case, like the Argon One I picked up recently, it is possible to overclock it to the maximum of 2.147 GHz without voiding the warranty and still keep the temperatures lower than at the stock frequency without any cooling. All this does tempt me to give the Pi a go as a daily driver, but for now there are many other creative uses for the device that take precedence. Until next time, Godspeed!

Musing #72: R.I.P. A50

Over the past few months, I had made multiple posts on the Samsung Galaxy A50, be it a short review, an initial analysis of the super slow-mo or a guide about making the most of the mid-range hardware. Unfortunately, all average (good-ish?) things come to an end and in this case it ended with my A50 being lost. The driver of the vehicle in which the phone was left behind gave me some hope by picking up my call, but what followed left me with a little less faith in humanity.

However, life goes on and move on I have. At the same time, I have no emotional attachment to any material possession, so this post is not a eulogy for the A50 but rather a short post on what can be done to make the most of the situation when a phone is lost.

Samsung puts a fair amount of bloatware on its phones, but one piece of software that is genuinely useful is "Find My Mobile". This feature is markedly better than what is offered by Google and there are several options for dealing with the lost device besides simply tracking it, such as erasing the device, ringing it, retrieving recent calls/messages and extending the battery life. Unfortunately, my trust in the driver led me to not immediately open the tracker, which in turn ensured that the device was never again switched on with my account activated.


With the horse having bolted from the barn, there can be some solace found in rendering the device useless, well, as a phone at least. The Department of Telecommunications (DoT) in India launched the Central Equipment Identity Register (CEIR) earlier this month, which is supposed to make the blocking of a lost phone as easy as snapping your fingers.

Unfortunately, as with any government initiative, things sound much better on paper and on websites than in reality. I went through the process of lodging a police complaint at the place where the phone was lost with the expectation of making the most of this lifeline afforded by the DoT in terms of being able to take some action on the lost device. As it turns out, while the website correctly verifies the IMEI using the dedicated tool, the form itself fails to submit with an error stating the absence of data for the IMEI. A really shoddy implementation by C-DOT, backed by an equally appalling lack of response on social media. I would still give them the benefit of the doubt considering it has been launched as a pilot project, but I hope they will be inclined to fix the website eventually.

Every misfortune is worth the experience, and I would say this is a lesson well learnt. A bit more practicality over trust in humanity might have saved the day. Hopefully, this post will equip you to handle such a scenario in a far better manner than I did. See you until my next mobile adventure.

Musing #71: Samsung Galaxy A50 Super Slow-Mo and Night Mode


A little over 24 hours ago, Samsung introduced the Super Slow-Mo and Night modes to the Galaxy A50. While Samsung does an impressive job with camera improvements on flagship devices, I had my expectations pared down for the A50.

With all reviews done and dusted at the time of the device's launch, it is unlikely that anyone other than someone who owns the device would test these features at short notice. Hence, here I am with this post.

Super Slow-Mo Mode:

The A50 has had a Slow Motion mode since launch. That mode recorded 720p video at 240 frames per second and played it back at the same rate. Hence, the Super Slow-Mo mode was a bit of a mystery since there was no official mention of what it comprises.

The super slow-mo mode on the S9 managed 960 FPS for 0.2 seconds. It seemed unlikely that Samsung would push a mid-range device that far, even though it has quite a capable chipset. The marketing material mentions the Exynos 9610 as being capable of recording Full HD at 480 FPS, but that seemed unlikely to happen.

The best way to find out what a new mode does is to test it out. Since there is nothing better than watching a (digital) stopwatch in slow motion, that's what I did. The process of recording itself gave no indication of what was actually happening, since it took over 2 seconds for the camera to start saving after initiating the recording, with the saving process itself taking even longer.

Normally, checking the metadata would sort things out, but in this case the output was classified as an 8.33 s, 30 FPS video; nothing abnormal about it but for the fact that it was supposed to be a super slow motion video. Thankfully, this is where the rather vapid stopwatch came to the rescue.


As can be seen in the video, the actual super slow motion part of the video lasts for about 0.4 s, from 0.69 s to 1.09 s. The video itself contains 250 frames, so accommodating 0.4 s of super slow motion implies that the recording rate was 480 FPS, as that constitutes 192 frames (480 x 0.4). The remaining 58 frames come courtesy of the normal 30 FPS recording preceding and following the super slow-mo part of the recording.
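The frame arithmetic is easy to verify; a minimal sketch, assuming the 480 FPS capture rate deduced above:

# Verifying the super slow-mo frame arithmetic from the saved clip.
total_frames = 250        # frames in the saved 30 FPS file
playback_fps = 30
slowmo_capture_s = 0.4    # real-time duration of the slow-motion burst
slowmo_fps = 480          # deduced capture rate

slowmo_frames = slowmo_fps * slowmo_capture_s        # 192 frames
normal_frames = total_frames - slowmo_frames         # 58 frames recorded at the normal 30 FPS
playback_duration_s = total_frames / playback_fps    # ~8.33 s, matching the metadata

print(slowmo_frames, normal_frames, round(playback_duration_s, 2))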

It's great having super slow motion video but to have it at 720p when the chipset is capable of 1080p is a let down. But then, considering the struggles of the sensor to capture light even at 720p, it seems that a 1080p clip might end up being downright unusable. That Samsung has even bothered to add this mode to this device is a huge plus since few would have expected it.

Night Mode:

The clamour for the Camera2 API on the A50 has been incessant, if for nothing else than the ability to use Google's incredible Night Mode. However, it is unlikely that Samsung would ever accede to that demand. Instead, A50 owners get Samsung's take on the Night Mode, which was always likely to be somewhat credible rather than incredible.

As always, in matters of camera, it is more apt to let the images do the talking. The rather compressed collage below gives an indication of how the various camera modes deal with extremely low light. It wouldn't take a detective to find out which one is which, so I'd rather take the easy way out of not labelling any of the images. However, for the purpose of verification and lack of astonishment, I have uploaded the original images with rather curt labels at this link.


Musing #70: Early days (of review)

It would be in good humour to pull a fast one on the 1st of April but keeping in line with what's in vogue with tech giants this season, I have refrained from doing the same; though you can always refer to my ode to this occasion from 3 years ago. This might however leave you wondering about the image accompanying this post.

Musing #69: Modern Monetary Theory


Heterodoxy can usher in amusement and stimulation in equal measure. In the context of economics, I have seen it pop up quite often at the mention of Modern Monetary Theory, to the extent that I have run out of fingers to count on. In the same breath, MMT (not this one) is equated with the licence to print as much money as needed with it being a "creature of law". This makes no sense to anyone with the faintest idea of economics, and neither does the zero-sum game between government spending and private saving.

Even if the theory has merits, it is often over-simplified by those pushing for its acceptance in the mainstream, which in turn makes it sound quite crazy. After quite some time, I have come across a source that does a simple and good job of discussing it. In general, I would recommend subscribing to the 'After Hours' podcast considering the fact that I ended up binge listening to it the first time I "tuned in".

Considering what happened in Zimbabwe, it will take a lot of effort to convince anyone that "unlimited printing of money by government" is a good idea, all things considered. At the same time, running fiscal deficits is a great idea, assuming there is a limit to it and that you are getting your money's worth. After all, debt makes the world go round. Running a deficit with a manageable debt-to-GDP ratio makes a lot of sense provided the government is getting a positive ROI in terms of social and economic benefits. What makes less sense is turning fiscal and monetary policies on their head without understanding their due impact.

There is no denying that the economic system is a belief and a social construct but there is no turning back the clock as well. Instead, the idea is to bring in meaningful change and MMT can contribute to that in parts, though not as a complete alternative. In the meantime, Calvinball anyone?

Musing #68: Making the most of Audible trial


Listening to a book doesn't quite have the same panache as reading one, but sometimes it is the only option. While I am an avid fan of podcasts on long commutes, sometimes audio books can fill the void quite well. However, in case of audio books, for me, it is unabridged or nothing, which depending on the author's preference for brevity can extend to numerous hours; a significant investment of time whichever way you look at it.

Musing #67: Valley of the Boom


I will admit that I am a sucker for tech docudramas and find it cringe-worthy when they go too far off the rails. It is one of the reasons that I could never appreciate Halt and Catch Fire to the extent that a tech aficionado ought to. On the other hand, Pirates of Silicon Valley was far more watchable despite its inaccuracies. Hence, when I came to know of 'Valley of the Boom', I had to give it a go. At the time of writing, only two of the six episodes have been made available on the NatGeo website, and while it is not fair to review a series in parts, it is certainly worth musing over.

Musing #66: Hey Siri Google


Siri has never been good to me. I have seldom been able to get it to do what I want it to. For a time after its inception, it was revolutionary. However, whereas intelligence develops with time from its infancy, this hasn't quite been the case with this artificial intelligence. On the contrary, it seems to have suffered cognitive impairment over time.

Musing #65: Purchasing a projector (Xgimi Z6 Polar)


Purchasing a projector is never a straightforward decision because you are not buying into its own product category but rather into the category of displays as a whole. Thus, one needs to weigh it up against purchasing a TV or even a monitor. It doesn't take much to figure out that the projector is the device of choice for a home cinema on a budget, where size matters more than anything else.

Musing #64: Escalation and issue resolution with Amazon, Flipkart and other services


The holiday season implies sales and savings on the major e-commerce websites but along with it comes the headache of dealing with customer service, or lack of it, when things go wrong. This dark side doesn't pop up when making occasional purchases through the year but raises its ugly head during the sale season on account of en masse purchases, which inevitably increases the probability of things going wrong.


Musing #62: Deleting Facebook account


It shouldn't really be a thing, but I felt the need to mark the day I deleted my Facebook account. Seldom are people able to destroy (traces of) their (digital) life with complete knowledge of its consequences and hence this occasion warrants a mention.

Facebook has been in the news for all the wrong reasons for the past few years, but the latest hack was the straw that broke the camel's back. I had already dialled up the privacy settings to "11" a few years ago and the only reason for the account to exist was for acquaintances to get in touch. However, as I realised over the course of time, most people yearn for a wider, if irrelevant, audience.

As far as connectivity is concerned, my mobile number as well as my current email address predate Facebook by over half a decade, so those who want to get in touch still can. It is no coincidence that a lot of account deletion guides have popped up again, but it would be best to refer to the one from Facebook itself. Going through the downloaded data is a trip down memory lane, but it is also an instant realisation of how much information Facebook has accumulated and retained over the years, despite using their highest privacy settings.

As a final relic, I have included the cover photo that I maintained at Facebook for nearly 3 years prior to deletion. I had no misapprehensions of what Facebook was all about but it was unfortunate to see the evil and greed quotient increase exponentially thereafter. As is the case for every decision that Facebook makes, account deletion is based on cost-benefit analysis. For me, the former far outweighed the latter. The cost may have always been invisible but the price paid definitely was not. It's time to move on to a brave new world without Facebook.

Musing #61: Adapting apps for Gear Fit2 (Pro)

While the original post was about the 2048 app, I feel it would be best to have a single post for all my adapted Gear Fit2 (Pro) apps. The original article is still present below for any guidance it may provide in installing the apps on the device. I will be listing the apps along with a screenshot and the link to download the *.wgt files. A short description has been included along with references to the original source/app.

1. 2048: Based on the latest source (Oct 2017) for 2048 posted on Github with suitable interface/colour modifications for Gear Fit2 Pro. Uploaded on Sep 11, 2018.

2. SciCal: Based on an app called 'Kalkulator' or 'Calculator Net 6' for the Gear S, I have renamed it to SciCal as it is a scientific calculator while adding a catchy icon from Wikimedia. The dimensions of all the "pages" of the calculator have been modified so that no scrolling is present. Unfortunately, the interface stays as it is due to the large amount of information involved. Uploaded on Sep 23, 2018.

Original Article (Sep 11, 2018):

It is no surprise that Samsung has artificially stifled the Gear Fit series so that it does not steal the limelight from their flagship "S" series. Consequently, Galaxy Apps store submissions for the Gear Fit2 and Pro are limited to watch faces, with partners like Spotify being the only ones allowed to publish apps for the device.

This doesn't imply that the device itself is incapable of running third-party apps. Samsung provides the necessary tools to create, install and run applications for the Tizen platform as a whole and this benefits the Gear Fit2 devices as well. However, without a centralised distributor, it takes a lot more effort to get an app distributed and installed on the device.

The Gear Fit2 is capable of running web apps which are essentially websites stored on the device. Hence, for my first Tizen app, I decided to go with the sliding-block puzzle game 2048 which is freely available on GitHub under MIT license and presents an everlasting challenge, even on the wrist.

Apart from scaling the game to fit the 216x432 screen, I have made a couple of tweaks to the interface so as to optimise the experience for the device. The first is switching the colour scheme to darker colours to preserve battery life on the SAMOLED screen as against the default lighter colour scheme. The second tweak, apart from adjusting the font size and spacing, is to switch the 'New Game' option higher up and to the left to prevent accidental resetting of the game when swiping up, as has happened to me on more than a few occasions.

I have uploaded the 2048.wgt file, as installed on my Gear Fit2 Pro. This implies that the file is self-signed and hence will not install on any other device. Thus, you will have to sign it specifically for your device prior to installation. Detailed instructions on the same can be found on XDA. After self-signing, the app can be installed using the Tizen Studio SDK by connecting to the device using "sdb connect <ipaddress>" and then issuing the command "sdb install 2048.wgt". Details on that command can be found here.
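To put those steps together, the install sequence looks roughly like this; note that the IP address is that of your watch on the local network and the self-signing step from the XDA guide must be completed first:

sdb connect <ipaddress>
sdb install 2048.wgt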

So, test it out and let me know how you feel about it in the comments. You may also share the details of any other web applications that you would like to see adapted for the Gear Fit2 devices.

Musing #60: PC Overclocking



Having grown up through the megahertz and subsequently the gigahertz wars, I can only say that speed matters. Over the years, I fought to get the last ounce of performance out of the system that was "machinely" possible. This was the case until Sandy Bridge arrived. On one hand, it offered the most value for money in an eternity and on the other, it set a trend where overclocking meant buying into the most expensive processors and motherboards.

Hence, it was a practical decision at the time to go with the i5-3470, a processor with a locked multiplier, along with an H77 chipset motherboard that was not meant to assist overclocking. It still offered the option to run all the cores at the turbo frequency of 3.6 GHz instead of the base frequency of 3.2 GHz and that is how it ran for nearly 6 years. It met every requirement I had of the system and a bit more, so as to not be concerned about upgrading.

However, as is always the case, my hand was forced, like it was in the past when I upgraded to the GTX 1060. Only this time, I had no intention of upgrading the trio of processor, motherboard and RAM, considering the inflated memory prices as well as AMD's Zen 2 and Intel's 10nm processors being around the corner. For the first time, I was left in the rather peculiar situation of needing to replace a component for a platform that had been discontinued for years.

Luckily, there is always the web that one can turn to. Scouring the tech forums for the desired motherboard is akin to hitting the lottery, and sure enough I didn't luck out. Then, I decided to go with one of the B75 chipset motherboards that were still mysteriously available on Amazon, only to discover that they were OEM boards with a locked BIOS and lacking compatibility with my RAM. So, after making the most of Amazon's gracious return policy, I took the final resort and went ahead with the purchase of a used motherboard on AliExpress, admittedly with my fingers crossed.

The shipment had its fair bit of drama over a period of 3 weeks but finally made its way through and was surprisingly well packaged. The absence of dust was a welcome sight, though the rusted socket screws immediately gave away the fact that the board was used. All things considered, the motherboard was in good condition and thankfully the mounting bracket was included.


The board, an Asus P8Z77-V LX, opened up CPU overclocking opportunities for the first time in ages, albeit limited ones on account of my existing hardware. Overclocking can't be thought of in isolation as due consideration needs to be given to heat. Intel's stock cooler is anything but the perfect foil for overclocking, and hence I first had to stock up (pun intended) on an after-market cooler. For this, I again turned to the used market and amazingly found an open-box Deepcool Gammaxx 300 for INR 1200 ($17) as opposed to the new unit price of INR 2000 ($29). It isn't something on any ardent overclocker's wishlist, but it gets the job done with its 3 heat pipes and a ginormous 120 mm fan.


To capture the difference that even a budget after-market cooler can make, I ran the stock cooler back-to-back with the Gammaxx 300 on the exposed motherboard. To check the stress temperatures, I simply bumped up the CPU multiplier over the default settings. Even in this setup, the Gammaxx 300 lowered the temperatures by over 20 degrees when under load while also ensuring a much lower idle temperature.


The bigger test however is ensuring lower temperatures in a constrained environment. In that sense, my cabinet (a generic old one at that) is not located in the most optimal position due to cabling constraints. Hence, I was expecting the temperatures to be much worse than they actually turned out to be. It also indicates that using the stock cooler was not even an option, unless you are looking for fried eggs and expensive paperweights.


Being out of the overclocking game for so long, I read up on the motherboard's features while the board was still in transit to fathom some of the newer terms and pretty much decided on a list of settings I would go about changing in my pursuit of performance with the lowest power consumption and heat generation. Thankfully, up until Ivy Bridge, Intel provided limited unlocked multipliers, 4 bins above the maximum turbo frequency. This meant that my i5-3470, with a base multiplier of 32 and a turbo multiplier of 36, was capable of being run at a multiplier of 40. This doesn't imply that all 4 cores can simultaneously hit the 4 GHz mark, as that is limited to 3.8 GHz by design. However, it means the chip can certainly hit the magical 4G mark when one or two of the cores are loaded. I suppose there is some satisfaction in finally getting an old horse to learn new tricks.
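In raw numbers, with the base clock (BCLK) at its default 100 MHz, those multipliers translate as below; a trivial sketch:

# i5-3470 frequencies at the default 100 MHz base clock (BCLK).
bclk_mhz = 100
base_mult, turbo_mult, max_oc_mult = 32, 36, 40   # base, stock turbo, turbo + 4 bins

print(base_mult * bclk_mhz / 1000)    # 3.2 GHz base
print(turbo_mult * bclk_mhz / 1000)   # 3.6 GHz stock turbo
print(max_oc_mult * bclk_mhz / 1000)  # 4.0 GHz with the extra 4 bins (1-2 loaded cores)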


Setting the multiplier at its maximum is easy and can even be done using the Auto or XMP overclock option. The difficult part is controlling the temperatures while also finding the limits of the RAM. To that end, I found the Load-Line Calibration to be an indispensable tool in tightening up the voltages and thereby lowering the offset. After much trial and error, I was able to set a stable CPU offset of -0.045V with the high (50%) LLC option which lowered the temperatures by a few more degrees and ensured next to no vDroop.

Running four sticks of RAM from different manufacturers is always a tricky proposition, even when the timings are the same. I had my initial CAS 9, DDR3-1600, 2 x 4 GB Corsair Vengeance kit teamed up with a similar G.Skill RipjawsX set from 4 years later. This meant the job of overclocking the RAM was anything but easy and involved numerous failed boots. Eventually, I was able to get them to run stably at 1800 MHz, CAS 10 with only a minor bump in voltage to 1.53 V. However, the impact on memory performance was not insignificant.
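The trade-off between frequency and CAS latency can be estimated quite simply; a rough sketch, assuming nominal DDR3 transfer rates:

# Absolute CAS latency before and after the RAM overclock.
def cas_latency_ns(transfer_rate_mt_s, cas):
    io_clock_mhz = transfer_rate_mt_s / 2    # DDR: the I/O clock is half the transfer rate
    return cas / io_clock_mhz * 1000         # latency in nanoseconds

print(round(cas_latency_ns(1600, 9), 2))   # DDR3-1600, CAS 9  -> 11.25 ns
print(round(cas_latency_ns(1800, 10), 2))  # DDR3-1800, CAS 10 -> ~11.11 ns

In other words, the absolute latency stays essentially flat while the bandwidth rises by 12.5%, which is where the gain shows up.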

I suppose it makes sense to go all-in when you have entered the game. Hence, I decided to overclock my GPU as well. For over 2 years, I never overclocked the Zotac GTX 1060 Mini, being, as it is, a single-fan device. Size can be misleading though, and the added CPU cooler certainly aids the overall airflow. It didn't take me long to figure out that the memory wasn't going to be up to the task, which is understandable considering it is not protected by a heat sink. In the end, I conservatively increased the memory clock by 100 MHz and the core clock by 200 MHz without touching the voltage.

A final tool available in pushing the clock even further is the base clock. Unfortunately, after setting up the overclock for all the other components, I found that the base clock increment to even 101 caused significant instability. Increasing the CPU and RAM voltage brought some modicum of stability but inexplicably reduced the performance across all benchmarks while simultaneously raising the temperature. Thus, there was no use pursuing this path any further.

The performance comparison of the overclocked system with the default one certainly provides some satisfaction. The XMP overclock is set to use the maximum CPU multiplier of 40 but was unable to run the RAM at 1800 MHz at the same time. Going by the considerably higher temperatures, it is obvious that the XMP overclock pushes the voltages a lot higher. The only upside is that it is capable of running all the cores simultaneously at 4 GHz, which produces a minuscule performance advantage. However, the manual settings are more than a match and come with a significant upshot in memory performance along with much better thermals.


While the upshot in CPU and RAM performance is quite evident looking at the table, the GPU performance is not. As it happens, PCMark doesn't stress the GPU much whereas Hitman seems to be constrained by the CPU. Thus, the need of the hour was a GPU intensive benchmark which came in the form of Heaven. As can be seen in the results, the overclock results in an FPS improvement of over 8% compared to the stock speeds. At the same time, it makes sense to set a custom fan curve as it can keep the temperatures down under full load.


To round up the post, no overclock is worth its salt without a stress and torture test. The idle CPU temperature of 27 °C is pushed up to 63 °C by AIDA64's stress test and then stratospherically to 77 °C by Prime95's torture test. However, this is well within the processor's specifications and represents the worst possible scenario that normally doesn't manifest itself even in the most taxing of daily use cases.


To conclude, this entire episode was brought about by an unforeseen failure in ageing hardware and hence the overclocking exercise is strictly incidental, but the thrill of it is as much as anyone would get when setting up a new system.

P.S.: If you followed my earlier post on Meltdown and Spectre, then you'd know it is something I thought of when buying the motherboard. Like with the ASRock boards, there was a helpful soul patching the unsupported Asus boards as well. However, when I went about flashing the BIOS, I found it to be incompatible due to the way it was packaged. Thankfully, Microsoft has fully patched Windows to support the latest microcodes from Intel (1F in the case of the i5-3470). It wasn't auto installed over Windows update and I had to manually install the KB4100347 patch for Spectre.

Musing #59: Waterproof Socks!


Soggy socks can be an arduous affair. It is a memento of the monsoon's dreariness that you are compelled to carry with you and there is certainly no escaping it in a sinking city with crumbling infrastructure. The fault of this plight lies in no way at nature's doorstep but rather on the nefariousness of human nature, but that musing is left for another day.

How then ought one to waddle through the inland pools of blight resembling water without the fear of being infected? Going barefoot spares the sogginess but at the risk of ending up on a hospital bed. Luckily, homo sapiens causeth and homo sapiens giveth, at a price of course.

The marvel of keeping water at bay while slaloming between potholes lies in getting oneself a pair of waterproof socks. Oh yes, they do exist! They have been lying in the burrows of e-commerce for years and it is for the needy one to dig it up, especially if you happen to be in a country where the product has no retail existence. Accordingly, yours truly got a pair, and a trekking one at that.

The proof lies in the pudding and I can only describe it as a miraculous feeling when squeaking boots are not in cahoots with shrivelled skin. Of course, prior to testing them on the roads, I had my feet immersed in a tub full of water, only to be oblivious of the existence of the second state of matter. It is like being dissociated from reality, though not in the sense of nirvana.

Of course, all is not hunky-dory since there is no ignoring the weight of the additional layers and the heat build-up in the dry. This one is certainly best saved for a rainy day. Then, there is the small matter of the price and my fingers are not enough to count the number of ordinary pairs that I could have purchased in lieu of this.

On the whole, I can't state how much I appreciate the dryness brought forth by this item, as much as I abhor the same in a human being. Sometimes, it's just the little experiences that make a high price seem totally worth it.

Musing #58: Mutual Fund (SIP) Portfolio Overlap Analyser



Being from a finance background, I made it a point to invest in SIPs early on. Over the years, while the investment amount has increased steadily, the number of funds being invested in has remained more or less constant. Hence, I need not emphasise how important it is to know where exactly the money is going.

Too often, the choice of a fund is made simply on returns, and diversification is achieved by selecting a different fund class. However, this provides no indication of the extent of value creation. I prefer to keep an eye on what's happening with my portfolio, not only when selecting a new fund but also to keep tabs on the existing investments.


My search for websites/files providing this information yielded a few options that were quite limited in nature, dispensing basic overlap information between two or three funds. Unable to find the requisite information, I decided to go it alone and create an Excel workbook that provides overlap analysis for up to six funds. The other target I had set for myself was to do so without the use of VBA, so the only permission required is access to the external data source - moneycontrol.com.

The workbook is structured into distinct sheets for input and detailed analysis. The 'Input' sheet is pretty straightforward and is essentially a two-step process requiring the funds and investment amounts to be entered along with the selection of the fund that would form the basis of checking the overlap. It would be a good idea to read through the notes prior to using the workbook. The sheet has some safeguards built in to alert the user about inconsistent inputs, like missing investment values/funds and failure to refresh the 'base fund' selection. At the same time, it is robust enough to still function immaculately when any of the selected funds are deleted.


Note that although the sheet includes funds with equity holdings from various classes, some of them do not have their holdings listed on moneycontrol.com, which may cause the error illustrated above. As such, there is nothing that can be done about it. Also, to state the obvious, the default funds selected in the sheet are for illustration and are not suggestive.


The 'Analysis' sheet provides the primary analysis of the portfolio. Besides listing the fund class and the equity holdings of each fund, it provides the percentage overlap of the base fund with all the other funds in the portfolio, both in terms of the number of stocks and the value invested. The charts in turn provide 'Top 10' visualisations for individual stocks as well as the different sectors.
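For anyone who prefers to see the logic outside Excel, here is a rough Python equivalent of the overlap idea, using made-up holdings purely for illustration; the workbook itself does this with formulas against the scraped moneycontrol.com data, and its exact formula may differ from the common 'minimum of the two weights' convention used here:

# Illustrative overlap calculation; the fund holdings below are hypothetical.
def overlap(base_holdings, other_holdings):
    """Each argument maps stock name -> % weight in the fund's equity portfolio."""
    common = set(base_holdings) & set(other_holdings)
    count_overlap = 100 * len(common) / len(base_holdings)                          # by number of stocks
    value_overlap = sum(min(base_holdings[s], other_holdings[s]) for s in common)   # by weight
    return count_overlap, value_overlap

fund_a = {"HDFC Bank": 9.0, "Infosys": 7.5, "Reliance": 6.0, "TCS": 4.0}
fund_b = {"HDFC Bank": 8.0, "Reliance": 7.0, "ITC": 5.0}

print(overlap(fund_a, fund_b))  # -> (50.0, 14.0): half the stocks overlap, ~14% by weight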


The 'Detail' sheet provides the tabular information that forms the basis of the analysis and lists all the values, as against only the Top 10 in the charts.


The 'MFx' sheets list the holdings of each fund, as retrieved from moneycontrol.com, and are subsequently used for the overlap calculations.


Finally, the 'List' sheet is a list of the funds retrieved from moneycontrol.com and covers the various equity fund classes. It is easy to add any new funds to the list in the specified format and the information can be scraped en masse from the MoneyControl site.

As is often the case, I have created something to primarily fulfil my needs but with the intention of sharing it with other netizens. Consequently, I am open to any suggestions for improvement which you may leave in the comments section.

Link: Download from Google Drive