Enterprise Architecture has always had a rocky existence in Asia (more so than in any other region). While most people I spoke to acknowledge some need for it, it is usually the first group to be cut when the economy is not doing well.
When I look back at my experiences, I think it is because many “Architects” have evolved from engineers and solution designers and have some trouble making the leap to “architecture”.
Therefore I decided to see how I can help in my own way. There are many books on modelling techniques, framework implementation and even certification guides, so I shall not duplicate that information. Instead, I decided to go back to basics and write an overview of EA: how it got to where it is, some important frameworks, and most importantly the areas of architecture that make up a complete EA view, to help folks starting out building EA teams understand the various perspectives of EA.
I also cover some important concepts on architectural thinking (moving from engineering to architecture) and how to get started on architecture documentation, which I find pretty lacking these days.
So, if you know anyone looking to understand the structure of EA, the basic concepts and how to document architecture decisions, do point them to my latest book.
Link to my book on Amazon here.
In an era where technology underpins nearly every aspect of modern life, the role of IT architects and engineers has become increasingly critical. These professionals design, implement, and maintain the complex systems that power our businesses, governments, healthcare, and personal lives. Given the profound impact of their work, I believe it is time that IT architects and engineers are certified and licensed.
Building architects and engineers have long been required to hold licenses to practice their craft due to the critical importance of ensuring structural safety. The collapse of any building or infrastructure can have catastrophic impacts on public safety, making it essential that only qualified and licensed professionals are entrusted with these responsibilities.
Today, the same can be said for IT systems that affect lives (hospitals, airports, utilities management systems, etc.). I think the industry needs to start thinking about the qualifications of IT professionals designing and implementing such critical IT infrastructure.
Here are several compelling reasons why certification and licensing should be a standard requirement in the IT profession:
1. Ensuring Competence and Professionalism: Certification and licensing serve as formal recognition of an individual’s knowledge, skills, and competence in their field. By requiring IT architects and engineers to undergo rigorous examinations and continuous education, certification ensures that they possess the necessary expertise to perform their duties effectively. This process helps maintain high standards of professionalism and ensures that individuals are up-to-date with the latest technological advancements and best practices.
2. Enhancing Security and Risk Management: The systems designed and maintained by IT professionals are often critical to the functioning of organizations and society at large. A single error or oversight can lead to significant security breaches, data loss, or system failures. Certified and licensed IT professionals are more likely to be aware of and adhere to stringent security protocols and risk management practices. This reduces the likelihood of costly and potentially catastrophic incidents.
3. Promoting Accountability and Ethical Standards: Licensing IT architects and engineers can help establish a framework of accountability. Licensed professionals are bound by a code of ethics and professional conduct, which promotes integrity, responsibility, and ethical behavior. This accountability ensures that IT professionals prioritize the public good and the interests of their clients, fostering trust and confidence in their work.
4. Standardizing Skills and Knowledge Across the Industry: Certification and licensing create a standardized benchmark for skills and knowledge within the IT industry. This standardization benefits employers, clients, and the professionals themselves. Employers can be confident that certified and licensed individuals meet a consistent level of expertise, reducing the risk associated with hiring and project execution. For IT professionals, certification and licensing can facilitate career mobility and recognition across different regions and industries.
5. Addressing the Increasing Complexity of IT Systems: As technology continues to evolve rapidly, the complexity of IT systems grows accordingly. IT architects and engineers must navigate a constantly changing landscape of technologies, frameworks, and methodologies. Certification programs ensure that professionals are continually updating their knowledge and skills, enabling them to effectively manage the complexity and drive innovation within their organizations.
6. Improving Quality and Efficiency of IT Projects: Certified and licensed IT professionals are more likely to follow industry standards and best practices, leading to higher quality and more efficient project outcomes. By ensuring that IT architects and engineers are well-trained and competent, organizations can reduce project delays, cost overruns, and failures, resulting in more successful IT initiatives and better return on investment.
7. Protecting Public Safety and Welfare: In many sectors, IT systems directly impact public safety and welfare. For example, healthcare systems, transportation networks, and critical infrastructure all rely heavily on sophisticated IT solutions. The certification and licensing of IT professionals in these areas ensure that only qualified individuals are entrusted with the responsibility of designing and maintaining these crucial systems, thereby protecting public safety and welfare.
The certification and licensing of IT architects and engineers are not just beneficial but essential in today’s technology-driven world. These measures ensure that IT professionals are competent, ethical, and accountable, thereby enhancing the overall quality, security, and reliability of IT systems. As technology continues to advance and integrate further into all aspects of society, the importance of certification and licensing will only grow, ultimately safeguarding the interests of businesses, governments, and the public.
DataOps, a portmanteau of "data" and "operations," is a set of practices, principles, and tools aimed at streamlining and automating data operations, ensuring the rapid and efficient delivery of high-quality data to end-users. It draws inspiration from DevOps, which emphasises collaboration between development and IT operations to accelerate software delivery and improve its quality. DataOps extends this collaboration concept to data-related processes. It involves cross-functional teams of data engineers, data scientists, data analysts, and other stakeholders working together to enhance data pipelines, reduce latency, and ensure data reliability.
As I am building up my data engineering team, I am adopting many of the things I have learned from building DevSecOps teams. Learning from application development and support, what matters these days is "Speed and Agility". With DataOps, just like DevSecOps, I can build or change data pipelines to respond quickly to changing business needs and market demands. By automating data pipelines and reducing manual interventions, data can flow seamlessly from source to destination, allowing for faster decision-making.
Unique to data is the concept of "Data Quality". High-quality data is crucial for accurate analytics and business insights. DataOps practices can ensure that data is cleansed, transformed, and validated before it reaches its intended destination; this is not unlike automated code testing in DevSecOps structures.
Next we have "Collaboration". DataOps fosters collaboration between different teams within an organisation. Cross-functional teams can work together to define data requirements, build data pipelines, and address data-related issues promptly. This is probably harder than in DevSecOps teams, as many users of data are not familiar with the IT world and need a lot of help to onboard into the DataOps teams.
Lastly, we have the concept of "Risk Mitigation". By automating data operations and implementing robust testing and monitoring processes, DataOps helps mitigate the risk of data breaches, errors, and downtime. Prior to the implementation of DataOps, these functions were very ad hoc in nature and could cause significant delays in decision making.
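To make the "Data Quality" idea above concrete, here is a minimal sketch of the kind of automated quality gate a DataOps pipeline can run before data reaches its destination. The field names and the null-ratio threshold are illustrative assumptions, not a real framework.

```python
def validate(records, required_fields, max_null_ratio=0.05):
    """Reject a batch if required fields are missing too often."""
    issues = []
    for field in required_fields:
        nulls = sum(1 for r in records if r.get(field) in (None, ""))
        ratio = nulls / len(records)
        if ratio > max_null_ratio:
            issues.append(f"{field}: {ratio:.0%} nulls exceeds threshold")
    return issues

# Hypothetical batch: one null amount, one null order_id
batch = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": "A2", "amount": None},
    {"order_id": None, "amount": 5.0},
]
problems = validate(batch, ["order_id", "amount"], max_null_ratio=0.10)
# With a third of each field null, both fields fail the 10% threshold
```

In a real pipeline this check would run automatically on every batch, failing the run (just like a failed unit test in DevSecOps) rather than letting bad data flow downstream.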
To successfully implement a DataOps team, there were a few learnings:
The hardest part was putting together a robust data governance structure and framework. Many organisations are not mature in this area, and it takes time to identify the right resources, educate the teams, and enforce the discipline needed to govern the data lifecycle.
DataOps represents a transformative approach to data management, aligning data operations with the principles of speed, quality, collaboration, and automation. As organisations continue to grapple with the challenges of managing and leveraging their data effectively, embracing DataOps can be a game-changer. By streamlining data operations and fostering collaboration among diverse teams, DataOps paves the way for data-driven decision-making, enabling businesses to stay agile and competitive in today's data-driven world.
There is a lot to learn from history. Cyber warfare in many ways reflects old-fashioned physical warfare, so one of the things I tend to look at is how we can learn from previous generations in this respect. In this post, I took a look at how naval warships have evolved over the years to arrive at today's modern designs.
Instead of writing a long text, I made a video to show some of these points and how ships today are built with an "assume breach" mindset and some of the lessons learnt in the implementation.
In today's world, we are getting more and more connected, and thus many things in our lives are made up of complex systems. When we look at how complex these systems of systems are, we can easily see that the way we have been taught to "isolate" problems and work on them can sometimes produce outcomes we did not want.
I have been actively teaching systems thinking classes, and I wonder why this is not more widespread in our education systems. Simple tools like the causal loop diagramming technique and the iceberg model are great tools to equip everyone with. We need to educate more people to look beyond the obvious and go deep when troubleshooting problems. I have added a short video I made on the iceberg model, which I find simple and easy to adopt.
Since 2018, I have been creating classes on Quantum Computing. It started out as a basic IBM Q Composer workshop (dragging and dropping gates visually) and has grown into the current 2-day class I conduct with Dr Lua Ruiping (my partner in crime :D ). The in-person class has a lot of labs and examples we build using Qiskit on IBM Q, and explores various aspects of quantum computing applications, like codifying Shor's algorithm to crack encryption based on the product of two prime numbers (e.g. RSA).
Today I decided to see if I can make the class into short videos and here is chapter 1 in the new series:
As we get deeper into the architecture series, I would like to take the opportunity to cover a bit more of the theory on why we model and why we need to follow methods or frameworks. I liken it to learning how to drive: you need to learn the highway code for every country (or environment) you are going to drive in, the local laws and also the maps. Those are the models of the environment (or system).
How you drove when you were taking your driving test and how you drive today would certainly be very different. When you were taking your test, you went through a "checklist" of behaviours like looking at all the mirrors, making sure your indicators are on, confirming your blind spots by turning your head, etc. before changing lanes. But today, based on your experience, you might do it all at once, take shortcuts, or rely on the blind-spot sensors in your new car.
The point is, you might no longer follow the "checklist" from learning how to drive, but you certainly remember the safety aspects and why you do them, although perhaps more subconsciously.
Hope you enjoy this session.
Following on from my earlier post on IT architecture, this is a follow-up on the types of architecture in IT and their relationships.
This is only one simplified view; there are many types of architecture that are not reflected here.
I have tried to pick the most common types of architecture to illustrate the ecosystem, so remember that context is important: different industries and domains might require specialised sub-domains.
I hope those embarking on the journey of learning will find this useful.
Recently I gave a talk on the topic of evolving threats in the manufacturing and retail industries. The main focus was the rise of IoT and Industry 4.0 and their impact on cybersecurity. Most cybersecurity measures are focused on IT systems, but with a lot of automation and autonomous robots, we need to relook at our cybersecurity measures. Here is a link to the YouTube video I created:
For the Cantonese version, please click here.
As we mature as an industry, we need to develop competency frameworks for the various roles in cybersecurity. I have started to develop a framework suited for a low to mid level end-user environment. You can find my framework here: https://ianloeacademy.com/wiki...
Recently I have been asked to give a presentation on new threats in the retail and manufacturing sectors. While preparing the material, I noticed many organisations are not ready for the fast shift to IT-enabled operations. With the increased prominence of AI-type solutions, there is a need to collect even more data to help the AI engines learn and make decisions. In the very physical world of retail and manufacturing, this data is usually collected from physical machines (either natively with newer models or with add-on sensors). To complicate matters, ESG is getting a lot of traction and energy efficiency is top of mind for many organisations.
With IoT, there are a few key things that were not previously significant in the traditional cyber defence world.
Firstly, as opposed to data exfiltration threats (data breaches, etc.), AI/ML solutions are more at risk from data injection threats. These solutions "learn" from the data collected, so if a threat actor manages to inject bad data into the environment, the resulting decision models from these AI/ML solutions can be compromised.
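A toy illustration of this injection risk, under an assumed scenario: an anomaly detector "learns" a threshold as mean plus three standard deviations of normal sensor readings. An attacker who injects a handful of extreme readings into the training data inflates that threshold, so a genuinely abnormal reading later slips past. The numbers and scenario are invented for illustration.

```python
from statistics import mean, stdev

def learn_threshold(readings):
    """'Train' a naive anomaly threshold from observed readings."""
    return mean(readings) + 3 * stdev(readings)

clean = [20.0, 21.0, 19.5, 20.5, 20.2, 19.8]     # normal sensor data
poisoned = clean + [80.0, 85.0, 90.0]            # attacker-injected readings

t_clean = learn_threshold(clean)       # roughly 21.8
t_poisoned = learn_threshold(poisoned) # blown out to well over 100

attack_reading = 60.0
print(attack_reading > t_clean)     # True: flagged when trained on clean data
print(attack_reading > t_poisoned)  # False: slips past the poisoned model
```

The defence implication is that the training data pipeline itself needs integrity controls, not just the systems that store the finished model.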
Second, the increased use of non-IP communication. Most traditional solutions were geared towards the knowledge worker and focused on IP traffic and common data protocols like HTTP/HTTPS. With the increase in machine communication needing some form of threat protection, new solutions are needed to inspect this traffic.
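To show what inspecting machine traffic can look like, here is a sketch that parses a simple fixed-layout binary frame and flags suspicious content. The frame layout (device id, command byte, float value) and the "safe" ranges are entirely invented for illustration; real industrial protocols like Modbus or OPC UA have their own specifications.

```python
import struct

def inspect_frame(frame: bytes):
    """Parse a hypothetical fixed-layout frame and flag suspicious fields."""
    # Assumed layout: 2-byte device id, 1-byte command, 4-byte big-endian float
    device_id, command, value = struct.unpack(">HBf", frame)
    alerts = []
    if command not in (0x01, 0x02):        # only READ/WRITE expected
        alerts.append(f"unexpected command 0x{command:02x}")
    if not (0.0 <= value <= 100.0):        # plausible sensor range
        alerts.append(f"out-of-range value {value}")
    return device_id, alerts

# A crafted frame with an unknown command and an implausible value
frame = struct.pack(">HBf", 42, 0x07, 250.0)
device, alerts = inspect_frame(frame)
# Both the unknown command and the out-of-range value are flagged
```

The point is that protection for machine traffic has to understand the protocol's structure and the physical plausibility of the values, not just IP headers.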
I am developing a set of guides for CIOs/CISOs in this area and will be sharing them later, once I have validated them with some of my industry colleagues.
Every day I hear about the "problem": we have a shortage of tech talent in Singapore to drive innovation, accelerate digital transformation and make the country a tech hotspot. We have many, many committees and working groups addressing the problem, but all I hear is build more schools! More training! More academies! And promote mid-career change!
I have my own hypothesis about this "problem". I seriously do not think it is the lack of training facilities or programmes. I have had the privilege of working in the US, both in the Bay Area (Silicon Valley) and in New York City (on Wall Street), for 10 years, and another 15 in Asia. What I see is not a problem of access to education or training or money (although that is a big factor).
I saw many bright young engineers heading to Silicon Valley even though their take-home pay would probably be higher in Singapore after taxes (as Singapore has one of the lowest tax rates in the world), and I did not see as many tech people wanting to go to New York (that may be changing now). While in the Bay Area I saw plenty of in-house tech talent even in non-IT companies, in New York I saw much of the IT services outsourced to the likes of IBM, TCS, Infosys, etc.
So here is my hypothesis: people are attracted to "hope". There is a sense of hope that by staying in tech they can rise to senior levels in the organisation and still be hard-core techies, loving what they do. In Singapore (and to some degree even in places like New York City), tech people tend to "jump off" the technical career track to further their careers. There is a lack of "hope" that a technical path will get them to the place they want to be (buying a house, buying a car, going on nice holidays, funding the kids' education). Somehow, in the local context, to succeed you need to move to "management": you need to take on people management, performance appraisals, budgeting exercises, business plans, etc.
What I see is many mid-career tech folks taking MBA and CFA classes in the hope of jumping off the tech path for a more secure future. So to me, the problem is not the beginning of the talent pipeline; we have access to the same resources someone in the Bay Area has in terms of tech education. No programmer takes a formal class to learn a new language; they just go online and pick it up! The challenge is keeping people motivated to stay the course. If this goes on, we may have lots of tech workers but only with 5-6 years of experience, so the best we can hope for is to be good at tech but not great. This is reflected in the lack of investment in research labs that allow people to stay on the technical path, reach new heights in their tech careers and contribute their experience to the technology ecosystem.
Let's all work toward building "hope" and see if we as leaders can create real technical career paths for the future generation and recover from this massive self-inflicted wound.
Recently, I had many folks coming to me with questions around Zero-Trust Architecture. The most common of these is "How do I get started?"
NIST Special Publication (SP) 800-207, Zero Trust Architecture, defines zero trust as a collection of concepts and ideas designed to reduce the uncertainty in enforcing accurate, per-request access decisions in information systems and services, in the face of a network viewed as compromised.
What this means is "the right level of access for the right credentials". To really get started on enforcing access rights, the most fundamental piece of the puzzle is a way to accurately determine the requestor's identity. Identity here is used loosely to mean a person, device, application or resource.
For user-type implementations, we need to accurately determine who the user is, what device they are using, and potentially where they are accessing data from (based on the credentials of the network used), and map via policy to the system/data they are requesting. For backend services, we need to know the credentials of the requesting application or microservice, which network segment the request is from, and potentially other information like time of day, request frequency, etc.
So to get started with ZTA, to me the most important first step is to have a clean inventory of credentials, which means the logical starting point is a good Identity and Access Management (IAM) system.
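The per-request, identity-plus-context decision described above can be sketched in a few lines. This is a minimal illustration of the idea, not an implementation of NIST SP 800-207's components; the roles, device-trust labels and resources are hypothetical, and the key property is that anything not explicitly allowed by policy is denied.

```python
POLICIES = [
    # (role, device trust, network, resource) tuples that are allowed
    ("clinician",  "managed", "hospital-lan", "patient-records"),
    ("clinician",  "managed", "vpn",          "patient-records"),
    ("contractor", "managed", "vpn",          "ticketing"),
]

def decide(identity, device_trust, network, resource):
    """Evaluate each request against policy; the default is deny."""
    for role, trust, net, res in POLICIES:
        if (identity, device_trust, network, resource) == (role, trust, net, res):
            return "allow"
    return "deny"

print(decide("clinician", "managed",   "vpn", "patient-records"))  # allow
print(decide("clinician", "unmanaged", "vpn", "patient-records"))  # deny
```

Note that every attribute in the decision traces back to a credential of some kind (user, device, network), which is why a clean IAM inventory has to come first.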
Recently I spoke at an event for Nutanix and am encouraged by the interest in the building of Private Clouds. I started on this journey in 2015 mainly focusing on IaaS, but as the technologies improved, so has the evolution of what we can do in a private cloud environment.
One of the really key benefits is being able to shift workloads seamlessly between the public cloud and the private cloud. Key enablers for this are the implementation of security groups and the ability to deploy Kubernetes workloads. Sure, we could do this with layers like OpenShift or Tanzu, but I think Nutanix Karbon gives us nice Kubernetes management with minimal overhead if you are running Nutanix HCI. The same goes for Nutanix Flow, which helps with the implementation of security groups so that we can duplicate our security structure without configuring firewall rules the traditional way for on-prem deployments.
Recently, I have also been exploring Nutanix Xi Leap, which simplifies backup and recovery (although it is not available in Asia yet). With all these advancements, I am hopeful for a future of more hybrid clouds in the enterprise.
If you are curious about these technologies, you can take a test drive here -> https://www.nutanix.com/one-platform
Hybrid cloud is here to stay. So what is a hybrid cloud? A hybrid cloud refers to a mixed computing, storage, and services environment made up of on-premises infrastructure, private cloud services, and a public cloud—such as Amazon Web Services (AWS) or Google Cloud Platform (GCP)—with orchestration among the various platforms. Using a combination of public clouds, on-premises computing, and private clouds in your datacenter means that you have a hybrid cloud infrastructure.
Many organisations have started to realise that going full public cloud might not be the most cost-effective route, especially if you have a steady workload that requires minimal elasticity or burst capability.
What has changed to make hybrid or private cloud more attractive? Well, in recent years, some of the Hyper-Converged Infrastructure (HCI) players like Nutanix have introduced capabilities that had long been exclusive to the public cloud, like security groups and native Kubernetes deployment (without the need to set up a host operating system).
And with more and more standards being developed, I think many organisations can now be more fluid in the movement of their workloads, and this in turn drives the adoption of private cloud/hybrid cloud approaches.
I will be speaking on this topic at the next Nutanix Economic Advantage Summit at Marina Bay Sands on 20 April 2021. See you there!
On 31 March I was invited to be on a panel hosted by the Cloud Security Alliance to talk about cloud security in the age of the hybrid cloud. I spoke on some of the key changes to the environment and things to look out for from a technology angle. My fellow panelists also spoke on their experiences, and it was a good session to see the different perspectives.
You can watch the session here : https://www.brighttalk.com/webcast/10415/468936
Had a short interview on CNA Radio to answer questions on the Singtel Data Breach.
Link can be found here: Asia First – SingTel server hack: How did it happen
Was recently featured in Enterprise Security Magazine on the topic of moving from EDR to MDR. As we are hit with an ever-increasing volume of incidents and a shortage of trained professionals to handle them, we need to leverage partnerships to deal with these issues. One of the ways I am addressing this is to make use of Managed Detection & Response services with a vendor, to help us react faster and give us time to mitigate the issues.
Here is a link to the article : https://managed-security-services-apac.enterprisesecuritymag.com/cxoinsight/from-edr-to-mdr-the-security-industry-gets-serious-about-visibility-nid-1480-cid-83.html
Ian Loe