Tuesday, December 24, 2019

DevOps in Legacy Systems

I had a discussion with a manager colleague of mine at the time about how to keep a planned item moving even when the only available team member lacks the expertise to take it on. The simple answer is to get that team member to a starting point with the help of the experienced ones, through a lightweight pair-work format, just enough to give them momentum. The task may not be completed on time, but at least it got moving.

In practice this is easier said than done, since we tend to fall into one of two mindsets: "It will be faster if the expert does it" or "I'll just take this non-priority item I already know rather than mess up that very critical one". This silo mentality is especially prevalent in legacy systems, adding to the hurdles of an already challenging DevOps adoption.

Most failed attempts to adopt DevOps practices are caused by teams' resistance to change, often because leaders oversimplify the problem. That is why, at the initial stages of a DevOps assessment, dialogue between leaders and legacy teams is required, so that visions and goals are communicated personally, uncertainties are addressed, and negotiations and agreements are formulated, which in turn earns people's confidence in and acceptance of the change. DevOps is a culture, after all. It is also worth noting that a dedicated DevOps team/person is not necessary; creating one has proven problematic for some organizations because the bottleneck simply got transferred to another entity.

An attitude of shared responsibility is an aspect of DevOps culture that encourages closer collaboration. It’s easy for a development team to become disinterested in the operation and maintenance of a system if it is handed over to another team to look after. If a development team shares the responsibility of looking after a system over the course of its lifetime, they are able to share the operations staff’s pain and so identify ways to simplify deployment and maintenance (e.g. by automating deployments and improving logging).  - Martin Fowler

Here are some of the things that need to be resolved and considered when deliberating on adopting DevOps:

Architecture and technology audit. From a DevOps point of view, it is important to include: a list of supported systems (both documented and client-deployed), the SCM branching strategy, and an end-to-end delivery process flow definition. In a decades-old legacy system, small automated processes and tools most likely exist already and should also be accounted for.
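
To keep the audit actionable, the delivery flow can be captured as structured data instead of prose. A minimal sketch in Python, where the stage names and tools are purely hypothetical:

    # Hypothetical end-to-end delivery flow, captured as data for the audit.
    delivery_flow = [
        {"stage": "build",   "tool": "make",            "automated": True},
        {"stage": "test",    "tool": "in-house runner", "automated": True},
        {"stage": "package", "tool": "manual steps",    "automated": False},
        {"stage": "deploy",  "tool": "shell scripts",   "automated": True},
    ]

    # Pre-existing small automations surface as stages already marked automated.
    coverage = sum(s["automated"] for s in delivery_flow) / len(delivery_flow)
    print(f"automation coverage: {coverage:.0%}")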

Assessment of your system against automation tool-sets, taking note of benchmark and baseline data to help define realistic targets for your metrics. And speaking of metrics, selecting the right kind, ones that can be translated into business-value terms and are understood by all, is key to continuous improvement. Examples: deployment speed, failure rates, delivery frequency, bug recurrence rate, etc.
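
To make these metrics concrete, here is a minimal sketch assuming deployments are recorded as (date, succeeded) pairs; the record format is hypothetical:

    from datetime import date

    # Hypothetical deployment log: (date, succeeded) pairs.
    deployments = [
        (date(2019, 11, 4), True),
        (date(2019, 11, 18), False),
        (date(2019, 12, 2), True),
        (date(2019, 12, 16), True),
    ]

    # Delivery frequency: deployments per 30-day window.
    span_days = (deployments[-1][0] - deployments[0][0]).days
    frequency = len(deployments) / (span_days / 30)

    # Change failure rate: share of deployments that failed.
    failure_rate = sum(1 for _, ok in deployments if not ok) / len(deployments)

    print(f"{frequency:.1f} deployments/30 days, {failure_rate:.0%} failure rate")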

Interface agreements. When teams are doing exploratory work, be it in automation or trying out new tools, establishing standards for how they will send their data across the entire delivery process promotes parallel development and helps with traceability. Most importantly, these data interfaces protect their autonomy: implementation is left to their discretion, as long as the final output adheres to a standard format compatible with the next phases of the pipeline. Given that, part of these interface agreements must include counter-checking mechanisms.
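
A counter-check can be as simple as validating every artifact's metadata against the agreed format before the next phase consumes it. A minimal sketch, assuming a JSON metadata file whose required fields are hypothetical:

    import json

    # Hypothetical agreed interface: every build artifact ships with a
    # metadata file containing these fields.
    REQUIRED_FIELDS = {"artifact_id", "version", "source_commit", "test_report"}

    def check_interface(metadata_path):
        """Reject an artifact whose metadata violates the agreed format."""
        with open(metadata_path) as f:
            metadata = json.load(f)
        missing = REQUIRED_FIELDS - metadata.keys()
        if missing:
            raise ValueError(f"artifact metadata missing: {sorted(missing)}")
        return metadata
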
But whether a DevOps initiative pushes through or not, it is always beneficial for an organization maintaining a legacy system to start investing in the following:
  1. Standards and best practices.
  2. Unit test coverage, including code involved in operations, to get the earliest feedback at the most fundamental level (see the sketch after this list).
  3. Automation, and the development of specialized tooling that caters to the system's automation needs.
  4. Technical debt repayment.
  5. Exploring the possibility of cloud and containerized solutions.
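
For item 2, operations code (deployment scripts, log parsers, housekeeping jobs) deserves the same unit-test discipline as product code. A minimal sketch, built around a hypothetical log-parsing helper:

    import unittest

    def parse_status(line):
        """Hypothetical operations helper: extract the status field from a
        log line such as 'JOB42 | DEPLOY | OK'."""
        parts = [p.strip() for p in line.split("|")]
        if len(parts) != 3:
            raise ValueError(f"malformed log line: {line!r}")
        return parts[2]

    class ParseStatusTest(unittest.TestCase):
        def test_ok_line(self):
            self.assertEqual(parse_status("JOB42 | DEPLOY | OK"), "OK")

        def test_malformed_line(self):
            with self.assertRaises(ValueError):
                parse_status("garbage")

    if __name__ == "__main__":
        unittest.main()
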
Porting legacy systems to DevOps is especially hard because of the practices and attitudes that came with them. As in any acclimatization process, small adaptations with demonstrable progress are more effective than a strict top-down approach. People are inherently motivated when progress results from their own volition, whether it improves themselves or something else. Full DevOps may never be 100% achieved, but at least there will be motivation to get it moving.

----------------------------------------------------------------------------------------------------------
This article was originally published on LinkedIn

Monday, December 16, 2019

Quantum Computing: Drivers of the Hype

Last October, I attended PSIA's SoftCon.ph 2019, where one of the plenary hall speakers was a Managing Director from Accenture, who discussed a new set of emerging technologies following the SMAC (Social, Mobile, Analytics, Cloud) technologies from six years ago: DARQ, which stands for Distributed Ledgers, Artificial Intelligence, Extended Reality, and Quantum Computing.

The first time I heard of a quantum computer, in BBC's 2013 documentary "Defeating the Hackers", I raised an eyebrow: quantum mechanics, as I recall it from college around 10 years ago, defies reality as we know it. And on pursuing the topic now, this branch of physics is still regarded as 'weird' even by physicists themselves.

Despite the uncertainties and the early phase of development, people are paying attention and already treating Quantum Computing/Computers (which I will refer to as QC from here on) as if it were the only way forward.

'Quantum Supremacy'

Just last October, Google announced that they had achieved 'quantum supremacy' with their QC system named Sycamore. The task involved, somewhat like a 'hello world' program for QCs, was just executing a random set of instructions, which they estimated would take the world's fastest supercomputer 10,000 years to complete. Sycamore made it in 200 seconds, or about 3.33 minutes.
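
For a sense of scale, the claimed speedup works out to over a billion-fold; a quick back-of-the-envelope check:

    SECONDS_PER_YEAR = 365 * 24 * 3600
    classical_estimate = 10_000 * SECONDS_PER_YEAR  # ~3.2e11 seconds
    sycamore_runtime = 200                          # seconds, per Google's claim

    print(f"claimed speedup: ~{classical_estimate / sycamore_runtime:.1e}x")
    # ~1.6e9, i.e. over a billion times faster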

At this point, there are no practical applications yet, but they compared the feat to the first successful flight of the Wright brothers: the beginning of something that could surely become very significant.

The Early Buy-ins

Even before Google's announcement, Accenture had already made their move. Last July, they acquired a patent for a Machine Learning module that would help business decision-makers determine if QC would be beneficial to them.

Accenture had been researching the feasibility of QC as a business solution since 2015. In 2017, partnering with 1QBit Information Technologies Inc., they were able to map out more than 150 use cases for the technology, most notably speeding up drug discovery.

With QC's power becoming more evident, more companies and investors are taking interest, including the Trump administration, whose support is mainly driven by cybersecurity concerns; the US government has made quantum research and development a priority.

The Ultimate Driver

Obviously, most of QC's potential applications come from areas where supercomputers still struggle. Molecular simulation (important for chemistry research) and forecasting (for meteorology, finance, logistics, etc.) are exponentially complex problems that classical computers are generally unable to overcome, except through approximations. Although there could still be ways to improve our current supercomputers, we are approaching the physical limits: expanding the infrastructure eventually becomes impractical unless we make every component smaller to maximize the use of space.

But here is the problem...
The heart of a supercomputer is its CPUs (IBM's Summit, the world's fastest supercomputer, has over 200,000 CPU cores), and the "building blocks" of a CPU are its transistors. Today's transistors are around 10-20 nanometers (the recently launched AMD Zen 2 is built on a 7 nm process), and we are still going for 5, even 3 nm, in the next couple of years; but as transistors get smaller, they will reach a point where they experience quantum tunnelling. We are nearing the end of Moore's law.

Imagine a transistor as a switch with an 'on' and an 'off' state; this is how a computer communicates, through the binary language of 1s and 0s respectively. In the 'off' state, a barrier stops the flow of electrons, but with quantum tunnelling, electrons are able to get to the other side regardless of that barrier, like a ghost passing through a wall. This means the transistor ceases to function as intended.
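
To see why the scale matters, a rough calculation helps: silicon atoms sit roughly 0.2 nm apart, so a 7 nm feature is only a few dozen atoms wide, thin enough for electrons to tunnel across. A quick sketch (0.235 nm is the approximate silicon nearest-neighbour spacing):

    SI_ATOM_SPACING_NM = 0.235  # approximate nearest-neighbour distance in silicon

    for feature_nm in (20, 7, 5, 3):
        atoms = feature_nm / SI_ATOM_SPACING_NM
        print(f"{feature_nm:>2} nm feature is only ~{atoms:.0f} atoms wide")
    # At a few dozen atoms, the barrier becomes thin enough to tunnel through.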

In closing...

Still, we do not expect QC to replace classical computers any time soon, or even in the next couple of decades. There is still a long way down the road, and QC is reserved for specific purposes, especially given its expensive setup. It is fundamentally different from classical systems, like a pencil is to a pen.

----------------------------------------------------------------------------------------------------------
This article was originally published on LinkedIn

Saturday, December 14, 2019

My notes on setting up PyCharm with Gerrit

PyCharm Notes
1. Install PyCharm
2. Install Git
3. Install Python interpreter
4. Install required Python modules

In PyCharm
1. Clone the projects
2. For multiple projects in one window, do an 'attach'
3. Configure the Python interpreter per project
4. Install packages/plugins via Settings - e.g. the Gerrit plugin
5. Set the project dependencies in Settings - e.g. Project1 and Project2 are dependent on COMMON
6. Set the COMMON project as a source directory

Gerrit - new change
1. Pull to latest
2. In committing: supply 'Author'
3. In pushing: make sure 'Push to Gerrit' is checked
4. Check that a new Gerrit record has been created by refreshing the Gerrit pane below (the equivalent raw git flow is sketched after this list).
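
For reference, a rough sketch of what the plugin automates under the hood, assuming a remote named origin, a target branch master, and Gerrit's standard refs/for/ convention (the function and its parameters are illustrative):

    import subprocess

    def push_new_change(message, author):
        """Rough command-line equivalent of the 'Push to Gerrit' flow."""
        subprocess.run(["git", "pull", "--rebase"], check=True)
        subprocess.run(["git", "commit", "-a", "--author", author,
                        "-m", message], check=True)
        # Pushing to refs/for/<branch> makes Gerrit open a new change.
        subprocess.run(["git", "push", "origin", "HEAD:refs/for/master"],
                       check=True)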

Gerrit - amending
1. Pull to latest. The Gerrit pane should also refresh.
2. Checkout the change (right-click - Checkout)
3. In committing, make sure of the following:
  - the commit message still contains the "Change-Id: I34e32........" line
  - Author is supplied
  - "Amend commit" is checked
4. In pushing: by default all projects are checked; uncheck any project that has no changes
  - Make sure "Push to Gerrit" is checked (the raw git equivalent is sketched below)
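
And the rough command-line equivalent of amending, under the same assumptions as above; Gerrit matches the amended commit to its open change through the Change-Id line, which is why that line must survive the amend:

    import subprocess

    def push_amended_change(author):
        """Rough command-line equivalent of amending an open Gerrit change."""
        subprocess.run(["git", "pull", "--rebase"], check=True)
        # --amend preserves the Change-Id footer, so Gerrit adds a new
        # patch set to the existing change instead of opening a new one.
        subprocess.run(["git", "commit", "-a", "--amend", "--no-edit",
                        "--author", author], check=True)
        subprocess.run(["git", "push", "origin", "HEAD:refs/for/master"],
                       check=True)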