It is still unclear to many what artificial intelligence actually is in concrete terms. At the same time, there is an ongoing debate about whether artificial intelligence poses a threat to humanity, not least because the topic is often associated with scenarios like those in the Terminator films. Fundamentally, however, this technology is neither difficult to grasp nor inherently dangerous. As is so often the case, the decisive factor is the context in which it is used.
In terms of its capability repertoire, artificial intelligence can be described as a system of nested boxes. The outermost box, “Artificial Intelligence”, contains concepts that enable a machine to make decisions in a primitive way, and even this level of decision making overlaps considerably with the human one. The game Tic-Tac-Toe is an example: thanks to the symmetric arrangement of the board and the fact that moves cannot be retracted, a simple decision tree can be implemented with little code to guarantee at least a draw, and in most cases a win.
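Such a decision tree can be sketched in a few lines. The following is a minimal illustration (all names are my own, not from any particular library) of a minimax search over the Tic-Tac-Toe game tree that never plays worse than a draw:

```python
# Winning lines of a 3x3 board, indexed 0..8 row by row.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if one side has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) from `player`'s view: +1 win, 0 draw, -1 loss."""
    w = winner(board)
    if w:
        return (1 if w == player else -1), None
    free = [i for i, v in enumerate(board) if not v]
    if not free:
        return 0, None                      # board full: draw
    other = 'O' if player == 'X' else 'X'
    best = (-2, None)
    for i in free:
        board[i] = player
        score, _ = minimax(board, other)    # opponent replies optimally
        board[i] = None
        if -score > best[0]:                # negate: their loss is our gain
            best = (-score, i)
    return best

# X to move on an empty board: perfect play from both sides is a draw.
board = [None] * 9
score, move = minimax(board, 'X')
```

With perfect play by both sides the returned score is 0, which is exactly the “at least a draw” guarantee mentioned above; against an imperfect opponent the same search finds the winning move as soon as one exists.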
Alongside concepts such as the decision tree, there is another box on this nesting level that groups concepts under the term machine learning. These concepts draw on data sets or statistics that are as extensive, and therefore as meaningful, as possible, from which forecasts are made and probable future behavior can be derived. Chess is an example: because the number of possible move sequences is astronomically large, a complete decision tree would grow just as large and is no longer feasible. From a data set containing as many board positions as possible, together with the moves played and the results of the games, one can extract moves that are more likely to lead to a victory. Selecting such previously successful moves and playing them in the current game thus increases the probability of winning.
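The statistical idea behind this can be sketched concisely. The following toy example (the data set and all names are invented for illustration) counts, for a given position, how often each recorded move led to a win, and prefers the historically most successful one:

```python
from collections import defaultdict

def move_win_rates(games, position):
    """games: iterable of (position, move, won) records.
    Returns {move: historical win rate} for the given position."""
    wins = defaultdict(int)
    total = defaultdict(int)
    for pos, move, won in games:
        if pos == position:
            total[move] += 1
            if won:
                wins[move] += 1
    return {m: wins[m] / total[m] for m in total}

def best_move(games, position):
    """Pick the move with the highest historical win rate, if any."""
    rates = move_win_rates(games, position)
    return max(rates, key=rates.get) if rates else None

# Toy data set: in position "P1", e4 won 2 of 3 games, d4 won 0 of 1.
games = [("P1", "e4", True), ("P1", "e4", True), ("P1", "e4", False),
         ("P1", "d4", False)]
choice = best_move(games, "P1")   # -> "e4"
```

Real chess engines generalize far beyond exact position matches, but the principle is the same: past outcomes in the data set steer the choice of the next move.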
Within the machine learning box, alongside the mathematical and statistical methods described above, there is yet another box that needs to be explained for a complete picture of artificial intelligence. The deep learning box contains methods that combine the above methods, data sets, and their creation with algorithms. By stacking multiple layers of such decision networks, as in artificial neural networks, complex tasks can be solved: recognizing objects in images, or swapping faces in movies, even in real time during a video call.
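What “multiple layers” buys can be shown with a deliberately tiny example. In this sketch the weights are chosen by hand rather than learned, but the layered structure is the same as in a real neural network: each layer applies weights, a bias, and a nonlinearity, and stacking two layers solves XOR, a problem no single layer can solve.

```python
def step(z):
    """Step activation: fire (1) if the weighted input is positive."""
    return 1 if z > 0 else 0

def layer(inputs, weights, biases):
    """One dense layer: weighted sum plus bias per unit, then activation."""
    return [step(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def xor_net(x1, x2):
    # Hidden layer: the first unit fires for OR, the second for AND.
    hidden = layer([x1, x2], weights=[[1, 1], [1, 1]], biases=[-0.5, -1.5])
    # Output layer: "OR but not AND" -- which is exactly XOR.
    (out,) = layer(hidden, weights=[[1, -1]], biases=[-0.5])
    return out
```

In deep learning these weights are not hand-crafted but learned from data, and the networks have many more layers and units; the forward pass through stacked layers, however, works just like this.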
Back to the actual topic. AIOps is short for “Artificial Intelligence for IT Operations” or “Algorithmic IT Operations”. The term was first used by Gartner in 2016 to describe IT or DevOps operations in which artificial intelligence is applied consistently for improvement: a probabilistic approach based on recorded data is used to make predictions and trigger automated recommendations for action. IT involves many tasks. Software and hardware must be maintained; users and customers demand changes and continuous development, which entails risk; and there is always the danger of attacks from outside. Used correctly, AIOps frees administrators for important tasks or saves personnel costs.
In order to face and successfully master the ever-growing challenges in IT, a transformation to AIOps can take place in roughly five steps:
- Reactive and meaningful evaluation of high-frequency monitoring messages.
- Proactive use of the evaluated data sets for error and root-cause identification, in order to find existing errors that have not yet become apparent and correct them before they take effect.
- Prognostic use of data sets to predict errors in the future.
- Prescriptive recommendation of a solution for each error found or predicted.
- Automated application of solutions to the found error causes or preventive steps for predicted errors.
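The first two steps above can be sketched with a classic technique from monitoring: comparing fresh measurements against a learned baseline. The following example (thresholds, metric values, and function names are all illustrative assumptions, not a reference to any AIOps product) flags response times whose z-score against the historical baseline is suspiciously high, before they turn into a visible outage:

```python
from statistics import mean, stdev

def detect_anomalies(history, new_values, threshold=3.0):
    """Flag values whose z-score against the baseline exceeds the threshold.

    history:    past measurements defining "normal" behavior
    new_values: fresh monitoring values to evaluate
    """
    mu, sigma = mean(history), stdev(history)
    return [v for v in new_values if abs(v - mu) / sigma > threshold]

# Baseline: response times around 100 ms; 480 ms clearly stands out.
history = [98, 101, 99, 102, 100, 97, 103, 100, 99, 101]
alerts = detect_anomalies(history, [100, 104, 480])
```

Production AIOps platforms replace this simple z-score with learned models and correlate many metric streams at once, but the pattern is the same: evaluate high-frequency monitoring data against a baseline and escalate only the deviations.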
Of course, many will now think that AIOps is just another of the many hype topics in the IT world that will soon fade away. In fact, AIOps has been in practical use for a long time; there is even a well-known example from television: