Information technology continues to change rapidly, with new areas of development emerging frequently. This post is based on a list of terms developed by WatSpeed staff, who sought a greater understanding of them. Each term is briefly described and accompanied by a short video that provides a fuller explanation:
Data Science
Information technology has greatly increased the data that is available to most organisations, from both internal and external sources. Data is available on operations, customers, employees, suppliers and many more aspects of activity. Data science helps organise the data and extract knowledge from it that can help the organisation make better decisions in many areas, including the development of new products, improvements to processes and the management of people.
Data Analytics
Data analytics is the work that is done to understand the extensive amount of data that is available in most organisations today. It includes the tools and techniques that are used in the analysis, such as mathematical processing to identify trends and predict future market behaviour. Data analytics and data science are often used interchangeably, though data science more often describes the broad, data-focused activity and data analytics the tools and techniques for knowledge extraction.
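As a concrete illustration of the "identify trends and predict future behaviour" idea, here is a minimal sketch of fitting a least-squares trend line to some monthly sales figures and projecting one month ahead. The figures are made up for illustration; real analytics work would use richer data and tooling.

```python
def fit_line(xs, ys):
    """Return slope and intercept of the least-squares line through the points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical monthly sales figures showing an upward trend.
months = [1, 2, 3, 4, 5, 6]
sales = [102, 110, 115, 123, 130, 138]

slope, intercept = fit_line(months, sales)
forecast_month_7 = slope * 7 + intercept
print(f"trend: +{slope:.1f} per month, month-7 forecast: {forecast_month_7:.0f}")
```

The same fit-then-extrapolate pattern underlies much of predictive analytics, just with more variables and more sophisticated models.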
Artificial Intelligence
Artificial intelligence (or AI) is computer technology that is able to perform tasks that would normally require the human brain. This includes the ability to learn and to solve problems. Artificial intelligence is used today in many areas, including healthcare (to interpret x-rays and scans and design treatment) and in self-driving cars.
Artificial General Intelligence is the term that describes human level artificial intelligence – where computers can think like humans. Some have argued that artificial intelligence may advance to a level where computers are "smarter" than humans, with the potential for negative human consequences. Most commentators believe this is unlikely.
Neural networks are an approach to computer design modelled on the structure of the human brain, with various sections that perform different forms of processing, which are then combined to produce an intelligent output.
Deep learning is the ability of computers to learn from large amounts of unstructured or unorganised data – for example, images, video and social media posts. The computer can review the data and produce a valuable result without human intervention, such as reviewing legal documents and suggesting strategies for lawyers.
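The "combined to produce an intelligent output" idea can be sketched in a few lines. Each unit computes a weighted sum of its inputs and applies an activation function; combining layers of units lets the network compute things no single unit could. The weights below are hand-chosen for illustration (real networks learn them from data) so that two hidden units combine into an XOR function:

```python
def step(x):
    """Simple on/off activation function."""
    return 1 if x > 0 else 0

def neuron(inputs, weights, bias):
    """One unit: weighted sum of inputs plus bias, passed through the activation."""
    return step(sum(i * w for i, w in zip(inputs, weights)) + bias)

def xor_network(x1, x2):
    h1 = neuron([x1, x2], [1, 1], -0.5)    # fires if either input is on (OR)
    h2 = neuron([x1, x2], [1, 1], -1.5)    # fires only if both are on (AND)
    return neuron([h1, h2], [1, -1], -0.5)  # OR but not AND = XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_network(a, b))
```

No single unit of this kind can compute XOR on its own; it takes the hidden layer plus the output unit – a small example of why depth matters.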
Blockchain
Blockchain is an internet-based way of securely recording and sharing information that operates without an overall manager or central control. It does this through a 'distributed ledger' that holds the records on many computers, making them harder to steal or falsify. Blockchain has many applications, such as land ownership records, academic records and supply chain records that ensure accuracy about product origins (such as food).
Blockchain is the technology that is used for cryptocurrencies, the best-known example of which is Bitcoin. Bitcoins are purchased with traditional currency, and their ownership is recorded and exchanged on the blockchain. Cryptocurrencies have many problems, such as the huge swings we have seen in their value, their use by criminals (because they can be held anonymously) and the large amount of electricity needed to mine them. Bitcoin mining is the process that creates new Bitcoins and maintains the Bitcoin blockchain.
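A minimal sketch of the chaining idea that makes blockchain records hard to falsify: each block stores a hash of the block before it, so altering any earlier record invalidates everything that follows. The records below are hypothetical, and real blockchains add distributed consensus and (for Bitcoin) proof-of-work mining on top of this basic structure.

```python
import hashlib

def block_hash(prev_hash, data):
    """Hash a block's data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

# Build a tiny chain of ownership records (made-up data).
GENESIS = "0" * 64  # no previous block
chain = []
prev = GENESIS
for record in ["Alice pays Bob 1 coin", "Bob pays Carol 1 coin"]:
    h = block_hash(prev, record)
    chain.append({"data": record, "prev": prev, "hash": h})
    prev = h

def chain_is_valid(chain):
    """Re-hash every block and check it still links to its predecessor."""
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["data"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

print(chain_is_valid(chain))                 # the untouched chain checks out
chain[0]["data"] = "Alice pays Eve 1 coin"   # tamper with an earlier record
print(chain_is_valid(chain))                 # the chain no longer validates
```

Because each hash depends on the one before it, a forger would have to rewrite every later block on every copy of the ledger – which is what makes the distributed design resistant to tampering.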
Quantum Computing
Quantum computing is super-fast computer technology that would be able to undertake tasks requiring a large amount of data processing very quickly, enabling more advanced artificial intelligence that could be beneficial in many areas, such as far better weather forecasting – perhaps years in advance. Quantum computing is at an early stage of development and requires very carefully controlled conditions to operate – extremely low temperatures and an absence of vibration and noise.
There are significant questions over the value that quantum computing will provide in the future. Some argue that the challenges its development faces will be impossible to overcome while others believe it will have a revolutionary impact within the next several years.
5G
5G is the latest mobile connectivity technology. Most mobile phones use 4G today; 5G is much faster and will allow for faster streaming of video and games, remote control of equipment from mobile devices, and more. 5G requires dedicated handsets (you can't use your existing phone with 5G) and a new network infrastructure with transmitters placed very close to each other.
Nanotechnology
Nanotechnology is the making of things that are very, very small and/or the use of materials at a very small scale. Nanotechnology is used in many different areas, such as computer equipment, medicine and food.
Cyber Risk
The increased use of technology in the world has increased vulnerability to its failure, whether because of a technical fault in the technology itself or because a human behaves, deliberately or inadvertently, in a way that causes a negative result. We have seen the consequences of this in cyber extortion, where access to computer systems is blocked until a ransom is paid; in data being stolen or exposed; and in systems failing, preventing work from being done. Cyber risk management is undertaken to identify and control this risk – eliminating it, reducing it or preparing for its impact if it occurs.
This post has provided an introduction to a range of topics in information technology today. It will be used as the basis for discussion to create better understanding of their application.