Modern supercomputer architectures have grown increasingly complex and diverse since the end of Dennard scaling in the mid-2000s, and they are far more difficult to program than their earlier counterparts. While HPC programming models have improved to the point that applications are now generally portable between architectures, their performance can still vary wildly, and developers must now spend a great deal of time tuning, or even rewriting, their applications for each new machine to achieve the performance they need.
The widespread use of machine learning, coupled with large datasets and increasingly complex models, has exposed a general lack of understanding of how individual predictions are made. It is perhaps unsurprising, then, that explainable AI (XAI) has become a very popular research topic over the past several years.
In this paper, we review the state of Internet of Things (IoT) security research, with a focus on recent countermeasures that attempt to address vulnerabilities and attacks in IoT networks. Because IoT encompasses a wide range of significantly distinct environments, each of which merits its own survey, our survey focuses mainly on the smart home environment. Based on the papers surveyed, we pinpoint several challenges and open issues that have yet to be adequately addressed in IoT security research.
Two decades after the first distributed denial-of-service (DDoS) attack, the Internet continues to face DDoS attacks. To understand why DDoS defense is a difficult problem, we must study how the attacks are carried out and whether the existing defense solutions are sufficient. In this work, we review the latest DDoS attacks and DDoS defense solutions. In particular, we focus on the key advancements and missing pieces in DDoS research.
We consider the problem of automatic camera selection in the context of in situ visualization. This problem is important because high-performance computing trends increasingly mandate in situ processing, and this processing paradigm frequently has no human in the loop, so new research is needed to automate decisions that previously were made by human beings. We begin by briefly evaluating what makes an image good, e.g., informative, aesthetically pleasing, and so on.
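To make the notion of a good image concrete, one common heuristic from the viewpoint-selection literature is viewpoint entropy, which scores a candidate camera by how evenly the visible geometry's projected area is distributed across the image. The sketch below is a minimal illustration of that idea, assuming a hypothetical `render_projected_areas(camera)` routine supplied by the renderer; it is not the selection criterion of any particular in situ system.

```python
import numpy as np

def viewpoint_entropy(projected_areas):
    """Entropy of the distribution of visible projected areas.

    projected_areas: 1D array of per-face (or per-cell) visible projected
    areas for one candidate camera; occluded faces contribute zero.
    """
    total = projected_areas.sum()
    if total == 0.0:  # nothing visible from this viewpoint
        return 0.0
    p = projected_areas[projected_areas > 0] / total
    return float(-(p * np.log2(p)).sum())

def select_camera(candidates, render_projected_areas):
    """Pick the candidate camera whose view maximizes viewpoint entropy."""
    scores = [viewpoint_entropy(render_projected_areas(c)) for c in candidates]
    return candidates[int(np.argmax(scores))]
```

Higher entropy favors views in which many faces are visible with comparable projected areas, which tends to correlate with informative images; other criteria (occlusion, silhouette length, data-value coverage) can be scored and combined in the same way.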
Non-functional requirements of typical applications tend to receive less attention during software development than functional requirements. Software performance, in particular, is often neglected during development, yet apparent performance flaws must be fixed before shipping. Dynamic software performance analysis aims to help developers locate performance flaws or confirm their understanding of an application's overall performance behavior.
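As a small, concrete example of dynamic performance analysis, the sketch below uses Python's standard cProfile and pstats modules to record where time is spent in a toy workload; `slow_sum` is an invented stand-in for an application hot spot, not code from any particular system under study.

```python
import cProfile
import pstats

def slow_sum(n):
    # Deliberately naive: repeated list construction makes this a hot spot.
    total = 0
    for i in range(n):
        total += sum(list(range(i % 100)))
    return total

profiler = cProfile.Profile()
profiler.enable()
slow_sum(50_000)
profiler.disable()

# Report the five functions with the largest cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

The resulting report points the developer at the functions that dominate execution time, which is exactly the kind of evidence needed either to locate a flaw or to confirm a mental model of the program's performance behavior.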
Flow visualization is a vital component in the workflow of studying computational fluid dynamics simulations. Integral curves, or streamlines, are among the most commonly used techniques for visualizing flow fields, and selecting a good set of streamlines remains a challenge. Identifying a representative set of streamlines that captures the flow behavior can be achieved either by strategically placing seed points or by selecting a subset of precomputed streamlines that exhibit desired properties.
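As a minimal illustration of the seed-placement strategy, the sketch below places seeds on a uniform grid and traces streamlines with fixed-step fourth-order Runge-Kutta integration through a synthetic circular 2D vector field; production flow-visualization tools handle sampled CFD fields, adaptive step sizes, and domain boundaries, none of which are modeled here.

```python
import numpy as np

def velocity(p):
    # Synthetic 2D flow (simple rotation) standing in for a sampled CFD field.
    x, y = p
    return np.array([-y, x])

def rk4_step(p, h):
    k1 = velocity(p)
    k2 = velocity(p + 0.5 * h * k1)
    k3 = velocity(p + 0.5 * h * k2)
    k4 = velocity(p + h * k3)
    return p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def trace_streamline(seed, h=0.05, steps=200):
    points = [np.asarray(seed, dtype=float)]
    for _ in range(steps):
        points.append(rk4_step(points[-1], h))
    return np.array(points)

# Uniform seed placement over the domain [-1, 1] x [-1, 1].
xs, ys = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
seeds = np.column_stack([xs.ravel(), ys.ravel()])
streamlines = [trace_streamline(s) for s in seeds]
```

Selection-based approaches would instead trace a dense set of such streamlines up front and keep only a subset scoring well on properties such as curvature or mutual distance.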
Each new generation of supercomputers increases both computing power and available memory, allowing scientists to generate higher-resolution physics-based simulations. Most of these simulations produce a massive amount of data, potentially comprising trillions of cells. Scientific visualization is an essential method for understanding this simulation data, and visualization algorithms are usually run on supercomputers to leverage their additional memory and computational power.
High performance computing (HPC) is an important asset to scientific research, enabling the study of phenomena, such as nuclear physics or climate change, that are difficult or impossible to study through traditional experiments, and allowing researchers to utilize the large amounts of data produced by experiments such as the Large Hadron Collider. No matter the use of HPC, the need for performance is always present; however, the fast-changing nature of computer systems means that software must be continually updated to run efficiently on the newest machines. In this paper, we discuss methods for addressing this challenge.
Given the importance of the Internet, it is crucial to assess its key characteristics (e.g., performance, stability, and resilience) through measurement as it expands and evolves over time. Measuring these characteristics is challenging, mainly because of the Internet's scale and heterogeneity.