Abstract

Contextual Design is a user-centered design process that uses in-depth field research to drive innovative design. Contextual Design was first introduced in 1988 and has since been used in a wide variety of industries and taught in universities all over the world. It is a complete front-end design process rooted in Contextual Inquiry, the widespread, industry-standard field data gathering technique. Contextual Design adds techniques to analyze and present user data, drive ideation from data, design specific product solutions, and iterate those solutions with customers.
In 2013, we overhauled the method to account for the way that technology has radically changed people's lives since the invention of touchscreen phones and other always-on, always-connected, and always-carried devices. This book describes the new Contextual Design, evolved to help teams design for the way technology now fits into people's lives. We briefly describe the steps of the latest version of Contextual Design and show how they create a continual immersion in the world of the user for the purpose of innovative product design.
Table of Contents: Introduction / Design for Life / Field Research: Data Collection and Interpretation / Consolidation and Ideation: The Bridge to Design / Detailed Design and Validation / Conclusion / References / Author Biographies

Abstract

This book is a comprehensive exposition of the thermodynamic properties of the van der Waals fluid, which evolved out of a course on thermodynamics and statistical mechanics at Iowa State University in the US. The main goal of the book is to provide a grap

Abstract

This book summarizes the current hard problems in software testing as voiced by leading practitioners in the field. The problems were identified through a series of workshops, interviews, and surveys. Some of the problems are timeless, such as education and training, while others such as system security have recently emerged as increasingly important.
The book also provides an overview of the current state of Testing as a Service (TaaS) based on an exploration of existing commercial offerings and a survey of academic research. TaaS is a relatively new development that offers software testers the elastic computing capabilities and generous storage capacity of the cloud on an as-needed basis. Some of the potential benefits of TaaS include automated provisioning of test execution environments and support for rapid feedback in agile development via continuous regression testing.
The book includes a case study of a representative web application and three commercial TaaS tools to determine which hard problems in software testing are amenable to a TaaS solution. The findings suggest there remains a significant gap that must be addressed before TaaS can be fully embraced by the industry, particularly in the areas of tester education and training, and of tool support for more types of testing. The book includes a roadmap for enhancing TaaS to help bridge the gap between potential benefits and actual results.
Table of Contents: Introduction / Hard Problems in Software Testing / Testing as a Service (TaaS) / Case Study and Gap Analysis / Summary / Appendix A: Hard Problems in Software Testing Survey / Appendix B: Google App Engine Code Examples / Appendix C: Sauce Labs Code Examples / References / Author Biographies

Abstract

The wireless medium is a shared resource. If nearby devices transmit at the
same time, their signals interfere, resulting in a collision. In traditional
networks, collisions cause the loss of the transmitted information. For this
reason, wireless networks have been designed with the assumption that
interference is intrinsically harmful and must be avoided.


This book, a revised version of the author's award-winning Ph.D.
dissertation, takes an alternate approach: instead of viewing interference
as an inherently counterproductive phenomenon that should be avoided, we
design practical systems that transform interference into a harmless, and
even a beneficial, phenomenon. To achieve this goal, we consider how wireless
signals interact when they interfere, and use this understanding in our
system designs. Specifically, when interference occurs, the signals get
mixed on the wireless medium. By understanding the parameters of this
mixing, we can invert the mixing and decode the interfered packets, thus
making interference harmless. Furthermore, we can control this mixing
process to create strategic interference that allows decodability at a
particular receiver of interest, but prevents decodability at unintended
receivers and adversaries. Hence, we can transform interference into a
beneficial phenomenon that provides security.
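The idea of inverting the mixing can be illustrated with a minimal sketch. This is not the book's actual receiver design (real WiFi receivers work with complex baseband samples and must first estimate the channel); it only shows the core linear-algebra step, assuming two senders whose symbols are combined by known channel coefficients:

```python
# Hypothetical illustration: two senders transmit symbols x1, x2 at once.
# The receiver observes two mixed measurements
#   y1 = h11*x1 + h12*x2
#   y2 = h21*x1 + h22*x2
# With the mixing coefficients h known, the 2x2 system can be inverted
# to recover both colliding symbols.

def invert_mix(h11, h12, h21, h22, y1, y2):
    """Solve the 2x2 linear system for the two transmitted symbols."""
    det = h11 * h22 - h12 * h21  # must be nonzero for the mix to be invertible
    x1 = (h22 * y1 - h12 * y2) / det
    x2 = (h11 * y2 - h21 * y1) / det
    return x1, x2

# Illustrative channel coefficients and transmitted symbols.
h11, h12, h21, h22 = 0.9, 0.4, 0.3, 0.8
x1, x2 = 1.0, -1.0

# What the receiver observes during the collision.
y1 = h11 * x1 + h12 * x2
y2 = h21 * x1 + h22 * x2

# Inverting the mix recovers (x1, x2) up to floating-point error.
print(invert_mix(h11, h12, h21, h22, y1, y2))
```

The same view explains the security results: a transmitter that deliberately shapes the mixing so that only the intended receiver knows (or can estimate) the coefficients leaves eavesdroppers with an underdetermined system they cannot invert.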


Building on this approach, we make four main contributions: We present the
first WiFi receiver that can successfully reconstruct the transmitted
information in the presence of packet collisions. Next, we introduce a WiFi
receiver design that can decode in the presence of high-power
cross-technology interference from devices like baby monitors, cordless
phones, microwave ovens, or even unknown technologies. We then show how we
can harness interference to improve security. In particular, we develop the
first system that secures an insecure medical implant without any
modification to the implant itself. Finally, we present a solution that
establishes secure connections between any two WiFi devices, without having
users enter passwords or use pre-shared secret keys.

Abstract

As society rushes to digitize sensitive information and services, it is imperative to adopt adequate security protections. However, such protections fundamentally conflict with the benefits we expect from commodity computers. In other words, consumers and businesses value commodity computers because they provide good performance and an abundance of features at relatively low costs. Meanwhile, attempts to build secure systems from the ground up typically abandon such goals, and hence are seldom adopted.


In this book, I argue that we can resolve the tension between security and features by leveraging the trust a user has in one device to enable her to securely use another commodity device or service, without sacrificing the performance and features expected of commodity systems. At a high level, we support this premise by developing techniques to allow a user to employ a small, trusted, portable device to securely learn what code is executing on her local computer. Rather than entrusting her data to the mountain of buggy code likely running on her computer, we construct an on-demand secure execution environment which can perform security-sensitive tasks and handle private data in complete isolation from all other software (and most hardware) on the system. Meanwhile, non-security-sensitive software retains the same abundance of features and performance it enjoys today.


Having established an environment for secure code execution on an individual computer, we then show how to extend trust in this environment to network elements in a secure and efficient manner. This allows us to reexamine the design of network protocols and defenses, since we can now execute code on endhosts and trust the results within the network. Lastly, we extend the user's trust one more step to encompass computations performed on a remote host (e.g., in the cloud). We design, analyze, and prove secure a protocol that allows a user to outsource arbitrary computations to commodity computers run by an untrusted remote party (or parties) who may subject the computers to both software and hardware attacks. Our protocol guarantees that the user can both verify that the results returned are indeed the correct results of the specified computations on the inputs provided, and protect the secrecy of both the inputs and outputs of the computations. These guarantees are provided in a non-interactive, asymptotically optimal (with respect to CPU and bandwidth) manner.


Thus, extending a user's trust, via software, hardware, and cryptographic techniques, allows us to provide strong security protections for both local and remote computations on sensitive data, while still preserving the performance and features of commodity computers.

Abstract

The goal of this text is to introduce, in a very elementary way, the concept of anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence to condensed matter physicists. This theory relates a gravity theory in a (d+1)-dimensional anti-de Sitter space

Abstract

This book is designed to help the non-specialist user of spectroscopic measurements and electronic structure computations to achieve a basic understanding of the underlying concepts of quantum chemistry. The book can be used to teach introductory quantum c

Abstract

Personal Information Management (PIM) is the art of getting things done in our lives through information. How do we – can we better – manage our information at home, at school, at work, at play and "@large" in a global community? How do we use information not only to know but also to represent, communicate and effect useful change in the world around us? In the study of PIM, does the search for practical methods with practical impact lead to methods that are "massive open on-line"? Can the ancient practice of storytelling help us better to weave our fragmented information together? In the practice of PIM, how can our information best serve as "near knowledge" – close at hand and, through our information tools, serving in practical ways to extend the knowledge that's "in the head"? If attempts to multitask lead to ineffective, even dangerous, instances of task switching and divided attention, can better PIM help us to realize, instead, opportunities for "multi-goaling" where the same time and effort accomplishes not just one but several goals? These and other questions are addressed in this third and final book to conclude the series on "The Future of Personal Information Management".

Part 1, "Our Information, Always and Forever", covered the fundamentals of PIM and then explored the seismic shift, already well underway, towards a world where our information is always at hand (thanks to our devices) and "forever" on the web.

Part 2, "Transforming Technologies to Manage Our Information", provided a more focused look at technologies for managing information. The opening chapter discussed "natural interface" technologies of input/output to free us from keyboard, screen and mouse. Successive chapters then explored technologies to save, search and structure our information. A concluding chapter introduced the possibility that we may see dramatic reductions in the "clerical tax" we pay as we work with our information.
Now in Part 3, "Building a Better World with Our Information", the focus shifts to the practical present and to the near future. Part 3 is in three chapters:

• Group information management and the social fabric in PIM. How do we preserve and promote our PIM practices as we interact with others at home, at work, at play and in wider, even global, communities? (Chapter 10).
• Designing for PIM in the development of tools and in the selection of teachable (learnable) "better practices" of PIM. (Chapter 11).
• "To each of us, our own" concludes with an exploration of the ways each of us, individually, can develop better practices for the management of our information in service of the lives we wish to live and towards a better world we all must share. (Chapter 12).

Abstract

Alzheimer’s Disease is characterized pathologically by two principal hallmark lesions: the senile plaque and the neurofibrillary tangle. Since the identification of each over 100 years ago, the major protein components have been elucidated. This has led in turn to the elaboration of metabolic cascades involving amyloid-β production in the case of the senile plaque, and phosphorylated-tau protein in the case of the neurofibrillary tangle. The pathogenesis and histogenesis of each have been the source of extensive investigation and some controversy in recent years, as both cascades have been implicated in the pathogenesis of Alzheimer’s Disease, relied upon in the diagnostic criteria for Alzheimer’s Disease at autopsy, and targeted for therapeutic intervention. With the accumulation of data and expansion of knowledge of the molecular biology of Alzheimer’s Disease, it appears that the enthusiasm for successful intervention has been premature. In this book, we detail the discovery and characterization of the major pathological lesions, their associated molecular biology, their relationship to clinical disease, and potential fundamental errors in understanding that may be leading scientific investigators in unintended directions.

Abstract

One of the major scientific thrusts in recent years has been to try to harness quantum phenomena to increase dramatically the performance of a wide variety of classical information processing devices. In particular, it is generally accepted that quantum co