On the Design and Implementation of Static Analysis Tools

At Microsoft, we now regularly apply a new generation of static analysis tools that automatically identify serious defects in programs. These tools examine millions of lines of code every day, long before the software is released for general use. With them, we catch more defects earlier in the development process, enabling Microsoft to deliver more reliable systems. A number of these tools have been released for general use through Microsoft's Visual Studio integrated development environment as well as through freely available development kits.

Leverage Commonality at Model Level: Model Checking Smart Home Applications

Our group is building Smart Home applications for cognitively impaired users. We have chosen to work with an existing framework, OSGi, which allows us to develop specific applications more quickly. We use a combination of traditional testing and formal verification to ensure these applications will cause no harm to the cognitively impaired users of our systems.

Remove the Memory Wall: From performance modeling to architecture optimization

Data access is a well-known bottleneck of high-performance computing (HPC). The prime sources of this bottleneck are the performance gap between the processor and memory storage and the large memory requirements of increasingly data-hungry applications. Although advanced memory hierarchies and parallel file systems have been developed in recent years, they provide high bandwidth only for contiguous, well-formed data streams and perform poorly when accessing small, noncontiguous data.
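To make the contiguity point concrete, here is a minimal, hypothetical sketch (in Python, not from the talk) contrasting contiguous with strided traversal of a flat row-major matrix. The names and sizes are illustrative assumptions; in compiled HPC code the strided loop is typically far slower because it defeats caching and prefetching, even though both loops touch exactly the same data.

```python
import array

N = 512  # illustrative matrix dimension
# Flat row-major N x N integer matrix: element (i, j) lives at index i * N + j.
data = array.array("q", range(N * N))

def row_major_sum():
    """Contiguous access: walk the buffer in memory order."""
    s = 0
    for i in range(N):
        base = i * N
        for j in range(N):
            s += data[base + j]
    return s

def col_major_sum():
    """Strided access: each step jumps N elements ahead in memory."""
    s = 0
    for j in range(N):
        for i in range(N):
            s += data[i * N + j]
    return s

# Both traversals visit the same elements and compute the same total;
# only the memory-access pattern differs.
assert row_major_sum() == col_major_sum()
```

Timing the two functions (e.g. with `timeit`) shows the gap most clearly in languages closer to the hardware; the same stride problem underlies the poor performance of small, noncontiguous I/O on parallel file systems.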

The True Challenges of 21st Century Information Security R&D

Today's information security is no longer about keeping people out; it is about letting people in - the right people, at the right time, to the right resources. Modern social and business practices require us to work closely together via access to the computing infrastructure and the Internet. Once connected, each person needs to be brought directly to the right resources. In this respect, information security today is the key "enabler" that propels the next-generation paradigm shift. The traditional view of information security as a protecting and prohibiting technology is out of date.

Personalized pedestrian navigation: user profile assessment under PC-RE framework

Personalization has been applied and researched in the fields of adaptive user interfaces, e-commerce, and requirements engineering. Many personalized systems do not work well because of the difficulty of inferring users' characteristics, particularly in the early stages of application usage. I argue that the alternative is to attend to users' abilities, goals, and preferences at the very early stage of requirements engineering.

Distinguished Lecture Series - On the Evolution of Adversary Models in Security Protocols

Invariably, new technologies introduce new vulnerabilities, which often enable new attacks by increasingly potent adversaries. Yet new systems are more adept at handling well-known attacks by old adversaries than at anticipating new ones. Our adversary models seem to be perpetually out of date: often they fail to capture real adversary attacks, and sometimes they address attacks rendered impractical by new technologies.

Distinguished Lecture Series - New Style Parallel Programming

Parallel computers have been touted as the "next big thing" for three decades, but software developers have largely been able to ignore them by leveraging the ever-increasing single-thread performance of modern microprocessors. This situation has completely changed within the last three years: flagship microprocessors from Intel, AMD, and IBM are all "multicores". Now everyone needs a strategy for parallel programming.
