Computer science is the study of how computers are designed and used. The field draws on theory, engineering, and experimentation, and it includes the study of algorithms that store, process, and communicate digital information. These theories and computations have helped solve everyday challenges for millions of people. The field is always evolving, however: new discoveries and refinements of existing principles come to light daily. Here are just a few of the most important current events in computer science.
Current Events in Computer Science: Understanding the Weather
Changing weather patterns make predicting conditions difficult. The growing uncertainty surrounding climate change is already rearing its head in spring and summer weather across the United States. As severe weather grows stronger, so does the need to know when it's coming. Meteorologists study hard to learn all they can about the weather, but they have some big help. Computers and monitoring systems help professionals all over the world keep an eye on incoming storms. And it's not just storms: temperature changes dictate farming practices, outdoor events, construction projects, and more. Overall, the weather is a universal variable in the lives of millions.
The wild thing about weather is the sheer number of systems that come and go on any given day. A team of meteorologists cannot watch the sky around the clock and keep up with every one of them. That's why computer scientists are turning to AI: artificial intelligence can now flag which systems may pose a threat to a given area. How does this differ from existing weather models and predictive software? It doesn't replace them; it makes them more effective. Combine the two, and we can likely keep up with anything coming our way.
AI Assistants in Weather Prediction
One of the most widely used weather platforms is AccuWeather. The software draws on monitoring tools all over the globe to track incoming systems. The key is to obtain as much data as possible: with heaps of information being collected, accurate predictions become more likely. With that in mind, the data can get overwhelming. For example, comma-shaped clouds point to severe weather threats more often than not. Instead of having meteorologists track every comma-shaped cloud on the model, AI is trained to do the job for them. After testing and experiments, AI built for weather monitoring can identify severe weather with 99% accuracy.
The process is like teaching a child to read: the more often they see a word, the better they recognize it. After being shown some 50,000 images of severe weather readings, an AI system can identify the storm systems we need to watch most closely. The models were already there; now they can be read more accurately, with AI alerting professionals to looming threats. It saves time and gives us a better idea of which systems deserve a second look and which will likely just move some clouds through the area. Computer science makes it possible, and everyone benefits.
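The training loop described above, showing a system many labeled examples until it can label new ones on its own, can be sketched with a toy classifier. This is a minimal illustration on synthetic feature vectors, not AccuWeather's actual system; the features, class means, and learning rate are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for image-derived features (e.g. cloud-shape
# descriptors). "Severe" readings cluster away from "calm" ones.
n = 200
severe = rng.normal(loc=2.0, scale=1.0, size=(n, 5))
calm = rng.normal(loc=-2.0, scale=1.0, size=(n, 5))
X = np.vstack([severe, calm])
y = np.concatenate([np.ones(n), np.zeros(n)])

# A logistic-regression classifier trained by plain gradient descent.
w, b = np.zeros(5), 0.0
for _ in range(500):
    z = np.clip(X @ w + b, -30, 30)
    p = 1.0 / (1.0 + np.exp(-z))          # predicted probability of "severe"
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

preds = (X @ w + b) > 0                   # decision boundary at p = 0.5
accuracy = float(np.mean(preds == y))
```

On this cleanly separated synthetic data the classifier labels essentially every example correctly; real severe-weather imagery is far messier, which is why tens of thousands of training images are needed.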
Current Events in Computer Science: Deep Learning
AI applications powered by neural networks are already assisting research in various fields, from climatology to particle physics. Last year’s Supercomputing show in Dallas, Texas demonstrated some of these applications. Many companies and experts in the field showed how neural network-powered AI can lead to major shifts in how we think about and conduct business. Both scientists and AI enthusiasts have taken a renewed interest in supercomputers, with companies like NVIDIA taking the first steps in this deep learning revolution.
Supercomputers have also helped build enthusiasm and encourage more funding for research. This is largely because deep learning has changed how researchers design and build hardware. GPU maker NVIDIA is at the forefront of this computing revolution, and the company used SC'18 to show the scope and potential of its work.
Current Events in Computer Science: HPC Defense Capability
The Air Force Research Laboratory (AFRL) has been hard at work, having just unveiled four new computer clusters. According to the official announcement, these clusters provide a network over which the Department of Defense can share confidential and critical information on a single computer system. The largest of the four is called "Mustang," a reference to a World War II-era fighter plane. The machine cost approximately $15 million to produce and houses 2,352 Intel Xeon Platinum 8168 CPUs, as well as 24 NVIDIA Tesla P100 GPUs. For the uninitiated, that makes for one extremely powerful computer.
Mustang, along with the three other computers (also named after aircraft), will form the Department of Defense's first supercomputing center. It's an incredible first for the Department and its peers. A major obstacle for large Defense projects is simply having the means to compute at this level. By letting this cluster do the work, rather than setting up a temporary network for each project, everything from money to manpower is saved.
Working Harder and Smarter
AFRL is very much a leader in producing high-end capabilities for federal departments. The organization's commitment to advancing computational tools means better work for everyone without added resources. Harnessing the full potential of the supercomputers ensures that important, often confidential work is done accurately and within a secure network. Programs gain access through this facility without having to front the cost of a whole system themselves. The trouble with many projects is that they don't last very long. Rather than spending hundreds of thousands of dollars for a few months of work, the four-computer cluster gives departments access to the capabilities of a huge government resource that many of them couldn't fund on their own.
The DoD sees this supercomputing system as a necessary step for moving the nation's defense systems into the 21st century and beyond. As it currently stands, the supercomputing network links multiple government agencies, providing them with previously unseen computing power. However, it is still unclear exactly how the system will be implemented across the entire US government. In short, the growing capabilities of governmental departments and organizations hinge on progress in computer science research. That means more jobs in the public sector, offering professionals a choice in where they work.
Current Events in Computer Science: Signal Processing and Artificial Intelligence
The Packard Fellowship for Science and Engineering is both a prestigious award and a highly sought-after asset for young researchers and engineers. The award provides scientists and engineers with enough funding to continue their work and the freedom to explore new avenues in their respective fields of research. It is given to just 18 of the country's most promising individuals each year. Mahdi Soltanolkotabi, an assistant professor in the Ming Hsieh Department of Electrical Engineering at the University of Southern California, is one of the most recent recipients of the Packard Fellowship.
Modern science is built on mathematical foundations, but these formulas and algorithms don't appear out of thin air. This is where Soltanolkotabi's work comes in. His groundbreaking techniques have helped develop algorithms with wide-ranging uses and applications. Many of today's learning problems rely on the kinds of algorithms that Soltanolkotabi is building. His work is particularly important because many people, even in the science and mathematics communities, do not fully understand these vitally important mathematical building blocks.
The Packard Foundation
Every year, the Packard Foundation narrows down its award recipients from a list of about 100 individuals at 50 universities across the nation. These researchers and dedicated educators work in a variety of disciplines, from earth science to engineering. A board of specialists assembles each year to assess the candidates and narrow the field to 18 winners. Each of these hard-working individuals then receives $875,000 over the course of five years to fund their work.
The Packard Foundation was established by David and Lucile Packard in 1964. Before they became prominent philanthropists, the couple grew a small electronics shop into one of the most successful information technology companies in the world. But even in the early days of managing their small business, David and Lucile Packard believed in helping the community. They accomplished this goal primarily through job creation and a caring management style. They didn't just want their employees to work efficiently; they wanted them to live happy, fulfilling lives both at work and at home. Today, the Packard Foundation continues their legacy by providing researchers and engineers with the funding they need to create a brighter future.
Current Events in Computer Science: Quantum Machine Learning Algorithms Could Solve Problems with Big Data
In other computer science news, quantum machine learning is poised to have a major impact on the future of big data analysis. Researchers at Purdue University are working to develop quantum machine learning algorithms that, in essence, analyze large amounts of data at much faster speeds than classical methods. In theory, this work could allow for much faster access to, and analysis of, big data.
Discovering New Materials
Equally exciting is the prospect of these algorithms advancing research in chemistry and other scientific fields of study. For these algorithms, the team at Purdue used something called a Quantum Boltzmann Machine. This is a type of neural network that helps organize large sets of data according to one or more assigned algorithms. The team has already put this machine to use screening molecules to help speed up the discovery of new materials.
Solar Farm Lifespan
Researchers are also enthusiastic about the potential benefits for the future of solar farms. Solar technology degrades over time, and there is currently no good way to determine the rate of degradation for a specific location. With quantum machine learning algorithms, researchers could determine how long a solar farm will last in a given geographic location, potentially allowing for more efficient and longer-lasting solar farms in the future.
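To see what a lifespan estimate involves even in the simplest classical form, consider fitting a degradation trend and extrapolating it. The numbers below (a ~0.7% annual decline, an 80% end-of-life threshold, the noise level) are illustrative assumptions, not figures from the Purdue work.

```python
import numpy as np

# Hypothetical yearly output readings for one site, normalized so the
# panels produce 1.0 in year zero.
rng = np.random.default_rng(1)
years = np.arange(10)
output = 1.0 - 0.007 * years + rng.normal(0.0, 0.002, size=years.shape)

# Fit a straight line to estimate the site's degradation rate...
slope, intercept = np.polyfit(years, output, 1)

# ...and extrapolate to when output falls to 80% of the original,
# a common end-of-life criterion for solar panels.
lifetime_years = (0.8 - intercept) / slope
```

A linear fit like this only works when the degradation rate is roughly constant; the appeal of machine learning (quantum or otherwise) is capturing how that rate varies with local climate and siting.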
If you're wondering where much of this big data comes from, the answer can be found in PMUs, or "phasor measurement units." Spread across the power grid, these devices measure current and voltage many times per second. Needless to say, PMUs create a massive amount of data in a very short time, which creates a challenge for those tasked with storing it all.
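Some rough arithmetic shows how quickly PMU data piles up. The reporting rate below is typical of real PMUs; the frame size and fleet size are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope estimate of daily PMU data volume.
reports_per_second = 60      # a PMU can report 60 synchrophasor frames per second
bytes_per_report = 64        # assumed size of one measurement frame
pmus_on_grid = 2500          # assumed number of deployed units

seconds_per_day = 86_400
bytes_per_day = (reports_per_second * bytes_per_report
                 * pmus_on_grid * seconds_per_day)
gigabytes_per_day = bytes_per_day / 1e9   # on the order of hundreds of GB daily
```

Under these assumptions the grid produces several hundred gigabytes of phasor data every single day, which is why both storage and analysis become bottlenecks.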
It is also extremely difficult to analyze so much data without a new, constantly evolving system for analysis. Currently, most companies use machine learning algorithms to make sense of the data they've collected. The future of big data analysis, though, appears to lie in quantum algorithms, which could provide analysis that is much faster and larger in scope. Even so, quantum algorithms can only go so far until we have practical quantum computers at our disposal.
Current Events in Computer Science: Robots Analyze Data, Humans Tell Stories
Reuters, one of the largest international news agencies in the world, has recently adopted a unique method for delivering news stories to its readers. By combining machine learning and human creativity, Reuters gets the best of both worlds. In what executive editor Reg Chua describes as a "cybernetic newsroom," the team relies on tools like News Tracer and Lynx Insight.
People and Machines Working Together
In essence, Lynx Insight is a tool that recognizes trends, facts, and relevant news stories in a huge set of available data. It draws from various sources and uses the aggregated data to determine which topics are most relevant to readers, as well as which facts would be most helpful in communicating those stories. In conjunction with algorithms developed by journalists at Reuters, Lynx Insight provides cutting-edge, data-driven reporting, and it is changing the way we think about modern journalism in the process.
While developing this cybernetic newsroom, Chua weighed the strengths and weaknesses of both humans and machines. In the end, he concluded that machines excel in "speed, breadth, and computation analysis." These strengths allow robots to process large amounts of information far faster than humans can. Humans, in turn, have strengths that robots lack. According to Chua, "we give directions to machines, we can give context to stories and we can deal with a non-data world, [and] we can get quotes better." In short, machines handle the big-picture stuff (or, more accurately, the "big data" stuff), while humans fill in the blanks, making sure the details reflect reality and tell a compelling story.
By combining the best of both worlds, Chua has created a newsroom that is always up to date without sacrificing journalistic integrity. At the moment, Lynx Insight only analyzes data for Reuters' market coverage, but Chua hopes to change this in the future. The cybernetic newsroom could expand to other areas, like sports, where data analysis is vital to coverage.
Though Lynx Insight has changed the way Reuters reports the news, it is not the first of its kind. Before Lynx Insight, there was News Tracer, a tool Reuters journalists primarily used to find relevant stories and reliable sources on Twitter. Before News Tracer, journalists might spend hours (or even days) tracking down a story via social media, and despite their best efforts, many still ended up with dead-end stories or unreliable sources. News Tracer cut down on the time needed to evaluate sources by sifting through millions of tweets, analyzing breaking news stories, and flagging poor sources.
The algorithm identifies potentially big news stories based on veracity and newsworthiness. For each potential story, News Tracer checks the facts against tweets concerning the same subject, then determines the reliability of those accounts based on follower counts, the inclusion of sources, and similar signals. This is essentially the same procedure journalists follow when vetting a story, but News Tracer covers a much larger volume of information at a much faster speed. Finally, the system presents the information to journalists for a final verification before publication. The process thus works on multiple levels to confirm the reliability of every story and source.
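The account-reliability check described above can be sketched as a simple weighted score. Reuters has not published News Tracer's actual model, so the features, weights, caps, and threshold here are invented purely for illustration.

```python
import math

def credibility_score(followers, cites_source, corroborating_tweets, verified):
    """Toy credibility score in [0, 1] built from the signals the article
    mentions. All weights and caps are illustrative assumptions."""
    score = 0.3 * min(math.log10(followers + 1) / 6, 1.0)  # audience size, log-scaled
    score += 0.25 if cites_source else 0.0                 # does the tweet cite a source?
    score += 0.25 * min(corroborating_tweets / 20, 1.0)    # independent corroboration
    score += 0.2 if verified else 0.0                      # verified account
    return score

# A sourced, widely corroborated report from a verified account scores high,
# while an unsourced claim from a small anonymous account scores low.
strong = credibility_score(50_000, True, 30, True)
weak = credibility_score(120, False, 1, False)
```

The log scaling is one reasonable design choice: it keeps a celebrity account with millions of followers from automatically outranking a well-sourced local reporter.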
In conclusion, these developments in computer science are helping the world solve many of the issues we currently face. The field has provided solutions, as well as employment opportunities, to many people across the globe, and new inventions continue to increase the speed at which we accomplish everyday tasks. Hopefully, these changes will usher in a brighter, more technologically advanced future.