5 Current Events in Computer Science

Computer science

Computer science is the study of how computers are designed and used. The field spans theory, engineering, and experimentation, and includes the study of algorithms that store, process, and communicate digital information. These theories and computations have helped solve everyday challenges for millions of people, yet the field is always evolving: new discoveries and variations on existing principles come to light daily. Here are just a few of the most important current events in computer science:

Deep Learning

AI applications powered by neural networks are already assisting research in fields ranging from climatology to particle physics. Last year's Supercomputing conference (SC'18) in Dallas, Texas showcased some of these applications, with companies and experts demonstrating how neural-network-powered AI can lead to major shifts in how we think about and conduct business. Both scientists and AI enthusiasts have taken a renewed interest in supercomputers, with companies like NVIDIA taking the first steps in this deep learning revolution.

Supercomputers have also helped build enthusiasm and attract more funding for research, largely because deep learning has changed how researchers design and build computing hardware. GPU maker NVIDIA is at the forefront of this shift, and the company used SC'18 to show the scope and potential of its work.
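The workload driving all of this hardware is conceptually simple: deep learning boils down to dense matrix multiplies and elementwise operations, which is exactly what GPUs excel at. As a minimal illustration (plain NumPy, not anything NVIDIA-specific), here is a tiny two-layer network trained by gradient descent on a toy regression task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn to output the average of two inputs.
X = rng.random((200, 2))
y = (X[:, :1] + X[:, 1:]) / 2

# Two-layer network; at scale, these matrix products are what GPUs accelerate.
W1 = rng.normal(0.0, 0.5, (2, 8))
W2 = rng.normal(0.0, 0.5, (8, 1))

for _ in range(2000):
    h = np.tanh(X @ W1)                 # forward pass, hidden layer
    pred = h @ W2                       # forward pass, output layer
    err = pred - y                      # gradient of squared-error loss
    W2 -= 0.1 * h.T @ err / len(X)      # backward pass, output weights
    W1 -= 0.1 * X.T @ ((err @ W2.T) * (1 - h ** 2)) / len(X)

mse = float(np.mean((np.tanh(X @ W1) @ W2 - y) ** 2))
```

The same forward/backward structure, scaled up by many orders of magnitude, is what supercomputers full of GPUs now run continuously.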

HPC Defense Capability

The Air Force Research Laboratory (AFRL) has been hard at work, having just unveiled four new computer clusters. According to the official announcement, these clusters provide the first "Shared Above-Secret Department of Defense High Performance Computing Capability." The largest of the four is called "Mustang," a reference to a World War II-era warplane. It cost approximately $15 million to build and packs 2,352 Intel Xeon Platinum 8168 CPUs, as well as 24 NVIDIA Tesla P100 GPUs. For the uninitiated, that makes for one extremely powerful computer.

Mustang, along with the three other computers (also named after aircraft), will form the core of the Department of Defense's first shared above-secret supercomputing capability. According to Jeff Graham, director of the AFRL DoD Supercomputing Resource Center (DSRC):

“AFRL has been at the forefront of the effort to establish this capability for the DoD. It shows our commitment to advancing computational tools being used to support the warfighter. The ability to share supercomputers at higher classification levels will allow programs to get their supercomputing work done quickly while maintaining necessary security. Programs will not need to spend their budget and waste time constructing their own secure computer facilities, and buying and accrediting smaller computers for short-term work. This new capability will save billions for the DoD while providing additional access to state-of-the-art computing.”

The DoD sees this supercomputing system as a necessary step in moving the nation's defense infrastructure into the 21st century and beyond. As it currently stands, the supercomputing network links multiple government agencies, providing them with unprecedented computing power. However, it remains unclear exactly how the system will be rolled out across the entire US government.

Signal Processing and Artificial Intelligence

The Packard Fellowship for Science and Engineering is both a prestigious award and a highly sought-after asset for young researchers and engineers. It provides scientists and engineers with enough funding to continue their work, along with the freedom to explore new avenues in their respective fields, and is given to just 18 of the country's most promising individuals each year. Mahdi Soltanolkotabi, an assistant professor in the Ming Hsieh Department of Electrical Engineering at the University of Southern California, is one of the most recent recipients of the Packard Fellowship.

Modern science is built on mathematical foundations, but those formulas and algorithms don't appear out of thin air. This is where Soltanolkotabi's work comes in: his groundbreaking techniques have helped develop algorithms with wide-ranging uses and applications, and many of today's machine learning problems rely on exactly the kinds of algorithms he is building. His work is particularly important because many people, even within the science and mathematics communities, do not fully understand these vitally important mathematical building blocks.

The Packard Foundation

Every year, the Packard Foundation considers roughly 100 candidates from 50 universities across the nation. These researchers and dedicated educators work in disciplines ranging from earth science to engineering. A board of specialists assembles each year to assess the candidates and narrow the field to 18 winners, each of whom receives $875,000 over the course of five years to fund their work.

The Packard Foundation was established by David and Lucile Packard in 1964. Before they became prominent philanthropists, the couple grew a small electronics shop into one of the most successful information technology companies in the world. Even in the early days of running that small business, David and Lucile Packard believed in helping the community, and they accomplished this primarily through job creation and a caring management style. They didn't just want their employees to work efficiently; they wanted them to live happy, fulfilling lives both at work and at home. Today, the Packard Foundation continues their legacy by providing researchers and engineers with the funding they need to create a brighter future.

Quantum Machine Learning Algorithms Could Solve Problems with Big Data

In other computer science news, quantum machine learning is poised to shape the future of big data analysis. Researchers at Purdue University are now developing quantum machine learning algorithms that, in essence, analyze large amounts of data far faster than classical methods. In theory, this work could allow for much faster access to, and analysis of, big data.

Discovering New Materials

Equally exciting is the prospect of these algorithms advancing research in chemistry and other scientific fields. For these algorithms, the team at Purdue used something called a quantum Boltzmann machine: a type of neural network that learns the statistical structure of large data sets. The team has already put this machine to use screening molecules to help speed up the discovery of new materials.
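The quantum version isn't something you can run on a laptop, but its classical ancestor, the restricted Boltzmann machine (RBM), is. Here is a minimal sketch trained with one-step contrastive divergence; all names, sizes, and parameters are illustrative, not the Purdue team's code:

```python
import numpy as np

rng = np.random.default_rng(0)

class RBM:
    """Classical restricted Boltzmann machine - the model that the quantum
    Boltzmann machine generalizes. It learns a probability distribution
    over binary data vectors."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible-unit biases
        self.b_h = np.zeros(n_hidden)    # hidden-unit biases
        self.lr = lr

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def hidden_probs(self, v):
        return self._sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return self._sigmoid(h @ self.W.T + self.b_v)

    def train_step(self, v0):
        """One step of CD-1 contrastive divergence."""
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden units
        pv1 = self.visible_probs(h0)                      # reconstruct visibles
        ph1 = self.hidden_probs(pv1)
        n = len(v0)
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b_v += self.lr * (v0 - pv1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)

# Train on one repeated binary pattern, then check the reconstruction.
pattern = np.array([1, 1, 1, 0, 0, 0], dtype=float)
data = np.tile(pattern, (20, 1))
rbm = RBM(n_visible=6, n_hidden=3)
for _ in range(500):
    rbm.train_step(data)
recon = rbm.visible_probs(rbm.hidden_probs(pattern[None, :]))
```

After training, the reconstruction probabilities closely match the pattern the machine was shown, which is the same learn-a-distribution idea applied at vastly larger scale when screening candidate molecules.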

Solar Farm Lifespan

Researchers are also enthusiastic about the potential benefits for the future of solar farms. Solar technology degrades over time, and there is currently no reliable way to determine the rate of degradation for a specific location. With quantum machine learning algorithms, researchers could estimate how long a solar farm will last in a given geographic location, potentially allowing for more efficient and longer-lasting solar farms in the future.
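Whatever model ultimately predicts a site's degradation rate, the lifespan question itself is simple arithmetic once that rate is known. A toy sketch, where both the rates and the 80%-of-rated-output retirement threshold are illustrative assumptions rather than values from the research:

```python
import math

def solar_lifespan_years(annual_degradation, end_of_life_fraction=0.8):
    """Years until panel output compounds down to the end-of-life fraction
    of rated capacity, assuming a constant annual degradation rate."""
    return math.log(end_of_life_fraction) / math.log(1.0 - annual_degradation)

# Two hypothetical sites with different climates (illustrative rates only).
mild_site = solar_lifespan_years(0.005)   # 0.5% output loss per year
harsh_site = solar_lifespan_years(0.010)  # 1.0% output loss per year
```

Doubling the degradation rate roughly halves the usable lifespan, which is why a location-specific rate estimate matters so much for siting decisions.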

PMUs

If you’re wondering where much of this big data comes from, the answer lies in PMUs, or “Phasor Measurement Units.” Spread across the power grid, PMUs continuously measure current and voltage. Needless to say, they create a massive amount of data in a very short amount of time, which poses a challenge for those tasked with storing it all.
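The measurement a PMU reports is a phasor: the magnitude and phase angle of the grid waveform at the nominal frequency, computed from a short window of samples many times per second. A minimal sketch of that core computation (the sampling rate and signal values below are illustrative):

```python
import numpy as np

def estimate_phasor(samples, fs, f0=60.0):
    """Estimate (RMS magnitude, phase in degrees) of a waveform at grid
    frequency f0 using a single-bin discrete Fourier transform."""
    n = len(samples)
    t = np.arange(n) / fs
    # Correlate the samples with a complex exponential at f0.
    z = (2.0 / n) * np.sum(samples * np.exp(-2j * np.pi * f0 * t))
    return np.abs(z) / np.sqrt(2), np.degrees(np.angle(z))

fs = 6000                                # 100 samples per 60 Hz cycle
t = np.arange(100) / fs                  # exactly one cycle of samples
volts = 170.0 * np.cos(2 * np.pi * 60 * t + np.radians(30))  # 170 V peak, +30 deg
mag, angle = estimate_phasor(volts, fs)
```

Each such (magnitude, angle) pair is tiny, but PMUs emit them tens of times per second from thousands of grid locations, which is where the storage and analysis burden comes from.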

It is also extremely difficult to analyze so much data without a new, continually evolving system for analysis. Currently, most companies use machine learning algorithms to make sense of the data they collect. The future of big data analysis, though, appears to lie in quantum algorithms, which could provide analysis that is much faster and broader in scope. Even so, for all the work researchers are doing, quantum algorithms can only go so far until true quantum computers are at our disposal.

Robots Analyze Data, Humans Tell Stories

Reuters, one of the largest international news agencies in the world, has recently adopted a unique method for delivering news stories to its readers. By combining machine learning with human creativity, Reuters gets the best of both worlds. In what executive editor Reg Chua describes as a “cybernetic newsroom,” the team relies on tools like News Tracer and Lynx Insight.

People and Machines Working Together

In essence, Lynx Insight is a tool that recognizes trends, facts, and relevant news stories in a huge set of available data. It draws from various sources and uses the aggregate data to determine which topics are most relevant to readers, as well as which facts would be most helpful for communicating those stories. In conjunction with algorithms developed by Reuters journalists, Lynx Insight delivers cutting-edge, data-driven information, changing the way we think about modern journalism in the process.

While developing this cybernetic newsroom, Chua weighed the strengths and weaknesses of both humans and machines. In the end, he concluded that machines excel in “speed, breadth, and computation analysis,” strengths that allow them to process large amounts of information far faster than humans can. Humans, in turn, have their own strengths that machines lack. According to Chua, “we give directions to machines, we can give context to stories and we can deal with a non-data world, [and] we can get quotes better.” In short, machines handle the big-picture (or more accurately, the “big data”) work, while humans fill in the blanks, making sure the details reflect reality and tell a compelling story.

By combining the best of both worlds, Chua has created a newsroom that is always current and up-to-date, without sacrificing journalistic integrity. At the moment, Lynx Insight only analyzes data for Reuters’ market coverage, but Chua hopes to change that in the future. The cybernetic newsroom could expand into other areas, such as sports, where data analysis is vital to coverage.

News Tracer

Though Lynx Insight has changed the way Reuters reports the news, it is not the first tool of its kind. Before Lynx Insight, there was News Tracer, which Reuters journalists used primarily to find relevant stories and reliable sources on Twitter. Previously, journalists might spend hours (or even days) tracking down a story via social media, and despite their best efforts, many still ended up with dead-end stories or unreliable sources. News Tracer cut down the time needed to evaluate sources by sifting through millions of tweets, analyzing breaking news stories, and flagging poor sources.

The algorithm identifies potentially big news stories based on veracity and newsworthiness. For each potential story, News Tracer checks the facts against tweets on the same subject, then gauges the reliability of those accounts based on follower counts, inclusion of sources, and similar signals. This is essentially the same procedure journalists follow when vetting a story, but News Tracer covers a far larger volume of information at far greater speed. Finally, the machine presents the information to journalists for a final verification before publication, so the process works on multiple levels to confirm the reliability of every story and source.
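As a rough illustration of that pipeline (News Tracer's actual model, features, and weights are proprietary; every name and number here is an invented stand-in), source scoring might look like:

```python
from dataclasses import dataclass

@dataclass
class Tweet:
    followers: int       # audience size of the account
    cites_source: bool   # links to or names a source
    verified: bool       # platform-verified account

def credibility_score(t: Tweet) -> float:
    """Toy credibility heuristic: weighted sum of simple signals,
    with follower count capped so huge accounts don't dominate."""
    score = min(t.followers / 100_000, 1.0) * 0.4
    score += 0.35 if t.cites_source else 0.0
    score += 0.25 if t.verified else 0.0
    return score

def newsworthy(tweets, threshold=0.5):
    """Flag a cluster of tweets as a candidate story for human review."""
    avg = sum(credibility_score(t) for t in tweets) / len(tweets)
    return avg >= threshold

# A cluster of credible accounts discussing the same event.
strong_cluster = [Tweet(200_000, True, True), Tweet(80_000, True, False)]
story_candidate = newsworthy(strong_cluster)
```

The key design point matches the article: the machine only nominates candidates at scale; the threshold-crossing clusters still go to a journalist for final verification.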

Bottom line

In conclusion, these advances in computer science are helping the world solve many of the problems people face today. The field has provided solutions, as well as employment opportunities, to individuals across the globe, and new inventions continue to increase the speed at which we accomplish everyday tasks. With luck, these changes will usher in a brighter, more technologically advanced future.
