Abstract

The new field of cryptographic currencies and consensus ledgers, commonly referred to as blockchains, is receiving increasing interest from a wide range of communities. These communities are very diverse and include, among others, technical enthusiasts, activist groups, researchers from various disciplines, start-ups, large enterprises, public authorities, banks, financial regulators, businessmen, investors, and also criminals. The scientific community adapted relatively slowly to this emerging and fast-moving field of cryptographic currencies and consensus ledgers. This was one reason that, for quite a while, the only resources available were the Bitcoin source code, blog and forum posts, mailing lists, and other online publications. Even the original Bitcoin paper, which initiated the hype, was published online without any prior peer review. Following the original publication spirit of the Bitcoin paper, much of the innovation in this field has repeatedly come from the community itself in the form of online publications and online conversations instead of established peer-reviewed scientific publishing. On the one hand, this spirit of fast, free software development, combined with the business aspects of cryptographic currencies and the interests of today's time-to-market focused industry, has produced a flood of publications, whitepapers, and prototypes. On the other hand, it has led to deficits in systematization and a gap between practice and the theoretical understanding of this new field. This book aims to further close this gap and presents a well-structured overview of this broad field from a technical viewpoint. The archetype for modern cryptographic currencies and consensus ledgers is Bitcoin and its underlying Nakamoto consensus. Therefore, we describe the inner workings of this protocol in great detail and discuss its relations to other derived systems.
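
To give a flavor of those inner workings: at the heart of Nakamoto consensus is a proof-of-work puzzle in which miners search for a nonce that makes the hash of a block header fall below a difficulty target. The minimal Python sketch below is illustrative only; the header layout, nonce encoding, and difficulty representation are simplified assumptions rather than Bitcoin's actual formats.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Find a nonce such that SHA-256(SHA-256(header || nonce)) has at
    least `difficulty_bits` leading zero bits. Real Bitcoin uses a fixed
    80-byte header and a compact 256-bit target, simplified away here."""
    target = 1 << (256 - difficulty_bits)  # the hash must fall below this value
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # proof found; anyone can verify it with one hash
        nonce += 1

# A toy difficulty of 20 bits takes about a million attempts on average.
print(mine(b"example block header", 20))
```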

Abstract

This is the first comprehensive history of human-computer interaction (HCI). Whether you are a user experience professional or an academic researcher, whether you identify with computer science, human factors, information systems, information science, design, or communication, you can discover how your experiences fit into the expanding field of HCI. You can determine where to look for relevant information in other fields—and where you won’t find it. This book describes the different fields that have participated in improving our digital tools. It is organized chronologically, describing major developments across fields in each period. Computer use has changed radically, but many underlying forces are constant. Technology has changed rapidly, human nature very little. An irresistible force meets an immovable object. The exponential rate of technological change gives us little time to react before technology moves on. Patterns and trajectories described in this book provide your best chance to anticipate what could come next. We have reached a turning point. Tools that we built for ourselves to use are increasingly influencing how we use them, in ways both planned and unplanned. The book ends with issues worthy of consideration as we explore the new world that we and our digital partners are shaping.

Abstract

Communities of Computing is the first book-length history of the Association for Computing Machinery (ACM), founded in 1947 and with a membership today of 100,000 worldwide. It profiles ACM's notable SIGs, active chapters, and individual members, setting ACM's history into a rich social and political context. The book's 12 core chapters are organized into three thematic sections. “Defining the Discipline” examines the 1960s and 1970s when the field of computer science was taking form at the National Science Foundation, Stanford University, and through ACM's notable efforts in education and curriculum standards. “Broadening the Profession” looks outward into the wider society as ACM engaged with social and political issues – and as members struggled with balancing a focus on scientific issues and awareness of the wider world. Chapters examine the social turbulence surrounding the Vietnam War, debates about the women's movement, efforts for computing and community education, and international issues including professionalization and the Cold War. “Expanding Research Frontiers” profiles three areas of research activity where ACM members and ACM itself shaped notable advances in computing, including computer graphics, computer security, and hypertext. Featuring insightful profiles of notable ACM leaders, such as Edmund Berkeley, George Forsythe, Jean Sammet, Peter Denning, and Kelly Gotlieb, and honest assessments of controversial episodes, the volume deals with compelling and complex issues involving ACM and computing. It is not a narrow organizational history of ACM committees and SIGs, although much information about them is given. All chapters are original works of research. Many chapters draw on archival records of ACM's headquarters, ACM SIGs, and ACM leaders. This volume makes a permanent contribution to documenting the history of ACM and understanding its central role in the history of computing.

Abstract

Edmund C. Berkeley (1909–1988) was a mathematician, insurance actuary, inventor, publisher, and a founder of the Association for Computing Machinery (ACM). His book Giant Brains, or Machines That Think (1949) was the first explanation of computers for a general readership. His journal Computers and Automation (1951–1973) was the first journal for computer professionals. In the 1950s, Berkeley developed mail-order kits for small, personal computers such as Simple Simon and the Brainiac. In an era when computer development was on a scale barely affordable by universities or government agencies, Berkeley took a different approach and sold simple computer kits to average Americans. He believed that digital computers, using mechanized reasoning based on symbolic logic, could help people make more rational decisions. The result of this improved reasoning would be better social conditions and fewer large-scale wars. Although Berkeley’s populist notions of computer development in the public interest did not prevail, the events of his life exemplify the human side of ongoing debates concerning the social responsibility of computer professionals.
This biography of Edmund Berkeley, based on primary sources gathered over 15 years of archival research, provides a lens through which to understand the social and political decisions surrounding early computer development, and the consequences of those decisions in our 21st-century lives.

Abstract

The time-worn aphorism “close only counts in horseshoes and hand grenades” is clearly inadequate. Close also counts in golf, shuffleboard, archery, darts, curling, and other games of accuracy in which hitting the precise center of the target isn't to be expected every time, or in which we can expect to be driven from the target by skilled opponents.
This book is not devoted to sports discussions, but to efficient algorithms for determining pairs of closely related web pages, and to a few other situations in which we have found that inexact matching is good enough: situations where proximity suffices. We will not, however, attempt to be comprehensive in our investigation of probabilistic algorithms, approximation algorithms, or even techniques for organizing the discovery of nearest neighbors. We are more concerned with finding nearby neighbors; if they are not particularly close by, we are not particularly interested.
In thinking about when approximation is sufficient, remember the oft-told joke about two campers sitting around after dinner. They hear noises coming towards them. One of them reaches for a pair of running shoes and starts to don them. The second notes that even with running shoes they cannot hope to outrun a bear, to which the first replies that most likely the bear will be satiated after catching the slower of them. We seek problems in which we don't need to be faster than the bear, just faster than the others fleeing the bear.
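
To make “close enough” concrete, one standard ingredient in near-duplicate detection, shown here purely as an illustration and not necessarily as the book's own algorithms, is to shingle each document into overlapping word k-grams and compare the resulting sets by Jaccard similarity:

```python
def shingles(text: str, k: int = 3) -> set:
    """Set of overlapping k-word shingles of a document."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity |A ∩ B| / |A ∪ B|, a value in [0, 1]."""
    return len(a & b) / len(a | b) if (a or b) else 1.0

doc1 = "the quick brown fox jumps over the lazy dog"
doc2 = "the quick brown fox leaps over the lazy dog"
print(jaccard(shingles(doc1), shingles(doc2)))  # similar but not identical
```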

Abstract

Ada’s Legacy illustrates the depth and diversity of writers, thinkers, and makers who have been inspired by Ada Lovelace, the English mathematician and writer. The volume, which commemorates the bicentennial of Ada’s birth in December 1815, celebrates Lovelace’s many achievements as well as the impact of her life and work, which has reverberated widely since the late nineteenth century. In the 21st century, we have seen a resurgence in Lovelace scholarship, thanks to the growth of interdisciplinary thinking and the expanding influence of women in science, technology, engineering, and mathematics. Ada’s Legacy is a unique contribution to this scholarship, thanks to its combination of papers on Ada’s collaboration with Charles Babbage, Ada’s position in the Victorian and Steampunk literary genres, Ada’s representation in and inspiration of contemporary art and comics, and Ada’s continued relevance in discussions around gender and technology in the digital age.
With the 200th anniversary of Ada Lovelace’s birth on December 10, 2015, we believe that the timing is perfect to publish this collection of papers. Because of its broad focus on subjects that reach far beyond the life and work of Ada herself, Ada’s Legacy will appeal to readers who are curious about Ada’s enduring importance in computing and the wider world.

Abstract

The aim of cryptography is to design primitives and protocols that withstand adversarial behavior. Information-theoretic cryptography, however desirable, is extremely restrictive, and most non-trivial cryptographic tasks are known to be information-theoretically impossible. In order to realize sophisticated cryptographic primitives, we forgo information-theoretic security and assume limitations on what can be efficiently computed. In other words, we attempt to build secure systems conditioned on some computational intractability assumption such as factoring, discrete log, decisional Diffie-Hellman, learning with errors, and many more.
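
For concreteness, one such assumption, decisional Diffie-Hellman (DDH), can be stated in its standard form (included here as an illustration):

```latex
% DDH: in a cyclic group G = <g> of prime order q, for uniformly random
% a, b, c in Z_q, the following distributions are computationally
% indistinguishable:
(g,\, g^{a},\, g^{b},\, g^{ab}) \;\approx_{c}\; (g,\, g^{a},\, g^{b},\, g^{c})
```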
In this work, based on the 2013 ACM Doctoral Dissertation Award-winning thesis, we put forth new plausible lattice-based constructions with properties that approximate the sought-after multilinear maps. The multilinear analog of the decisional Diffie-Hellman problem appears to be hard in our constructions, and this allows for their use in cryptography. These constructions open doors to providing solutions to a number of important open problems.
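
The multilinear maps approximated here generalize bilinear pairings; in standard notation (again illustrative, not the thesis's exact formalism), a k-linear map and the associated hardness assumption look as follows:

```latex
% A k-multilinear map e : G^k -> G_T satisfies, for all a_1,...,a_k in Z_q,
e(g^{a_1}, g^{a_2}, \ldots, g^{a_k}) \;=\; e(g, g, \ldots, g)^{a_1 a_2 \cdots a_k}.
% Multilinear DDH: given g^{a_1}, ..., g^{a_{k+1}}, the target value
% e(g, ..., g)^{a_1 a_2 \cdots a_{k+1}} is indistinguishable from random.
```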

Abstract

Smarter Than Their Machines: Oral Histories of the Pioneers of Interactive Computing is based on oral histories archived at the Charles Babbage Institute, University of Minnesota. Included are the oral histories of key pioneers of the computer industry, selected by John Cullinane, whose work led to interactive computing, such as Richard Bloch, Gene Amdahl, Herbert W. Robinson, Sam Wyly, J.C.R. Licklider, Ivan Sutherland, Larry Roberts, Robert Kahn, Marvin Minsky, Michael Dertouzos, and Joseph Traub, as well as his own. John has woven them together with introductions into what is, in essence, a personal walk down the computer industry road. John had the unique advantage of having been part of, or witness to, much of the history contained in these oral histories, beginning as a co-op student at Arthur D. Little, Inc., in the 1950s. Eventually, he would become a pioneer in his own right by creating the computer industry's first successful software products company (Cullinane Corporation). An added benefit of reading these oral histories is that they contain important messages for today's leaders, at all levels, including that government, industry, and academia can accomplish great things when working together in an effective way. This is how the computer industry was created, which then led to the Internet, both totally unanticipated just 75 years ago.

Abstract

The wireless medium is a shared resource. If nearby devices transmit at the
same time, their signals interfere, resulting in a collision. In traditional
networks, collisions cause the loss of the transmitted information. For this
reason, wireless networks have been designed with the assumption that
interference is intrinsically harmful and must be avoided.


This book, a revised version of the author's award-winning Ph.D.
dissertation, takes an alternative approach: Instead of viewing
interference as an inherently counterproductive phenomenon that should
be avoided, we design practical systems that transform interference
into a harmless, and even beneficial, phenomenon. To achieve this goal,
we consider how wireless signals interact when they interfere, and use
this understanding in our system designs. Specifically, when
interference occurs, the signals get mixed on the wireless medium. By
understanding the parameters of this mixing, we can invert the mixing
and decode the interfered packets, thus making interference harmless.
Furthermore, we can control this mixing process to create strategic
interference that allows decodability at a particular receiver of
interest, but prevents decodability at unintended receivers and
adversaries. Hence, we can transform interference into a beneficial
phenomenon that provides security.
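
To illustrate the core idea in a deliberately simplified model (flat, known channels; not the dissertation's full decoding algorithms): when two transmitters collide, the receiver observes a linear mixture of their signals, and with enough independent observations of the same colliding symbols, the mixing can be inverted as a linear system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two unknown transmitted symbols (e.g., from two colliding packets).
x = rng.choice([-1.0, 1.0], size=2)   # BPSK symbols

# Two observations of the collision with different, assumed-known channel
# coefficients (e.g., from two retransmissions of the same collision).
H = np.array([[0.9, 0.4],
              [0.3, 1.1]])            # the mixing parameters
noise = 0.01 * rng.standard_normal(2)
y = H @ x + noise                     # what the receiver actually hears

# Invert the mixing to recover both colliding symbols at once.
x_hat = np.sign(np.linalg.solve(H, y))
print(x, x_hat)                       # decoded symbols match the originals
```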


Building on this approach, we make four main contributions: We present the
first WiFi receiver that can successfully reconstruct the transmitted
information in the presence of packet collisions. Next, we introduce a WiFi
receiver design that can decode in the presence of high-power
cross-technology interference from devices like baby monitors, cordless
phones, microwave ovens, or even unknown technologies. We then show how we
can harness interference to improve security. In particular, we develop the
first system that secures an insecure medical implant without any
modification to the implant itself. Finally, we present a solution that
establishes secure connections between any two WiFi devices, without having
users enter passwords or use pre-shared secret keys.

Abstract

As society rushes to digitize sensitive information and services, it is imperative to adopt adequate security protections. However, such protections fundamentally conflict with the benefits we expect from commodity computers: consumers and businesses value commodity computers because they provide good performance and an abundance of features at relatively low cost. Meanwhile, attempts to build secure systems from the ground up typically abandon these goals and hence are seldom adopted.


In this book, I argue that we can resolve the tension between security and features by leveraging the trust a user has in one device to enable her to securely use another commodity device or service, without sacrificing the performance and features expected of commodity systems. At a high level, we support this premise by developing techniques to allow a user to employ a small, trusted, portable device to securely learn what code is executing on her local computer. Rather than entrusting her data to the mountain of buggy code likely running on her computer, we construct an on-demand secure execution environment which can perform security-sensitive tasks and handle private data in complete isolation from all other software (and most hardware) on the system. Meanwhile, non-security-sensitive software retains the same abundance of features and performance it enjoys today.
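
A minimal sketch of the kind of check the trusted device could perform is remote attestation: the computer reports a signed hash (a measurement) of the code it is about to run, and the portable device verifies the signature and compares the measurement against a known-good value. The Python sketch below simulates this flow with an HMAC standing in for a TPM's signature; the names and flow are illustrative assumptions, not the book's exact protocol.

```python
import hashlib
import hmac

DEVICE_KEY = b"shared-secret-between-tpm-and-verifier"  # stand-in for a TPM key

def measure_and_attest(code: bytes) -> tuple:
    """Computer side: measure the code and 'sign' the measurement."""
    measurement = hashlib.sha256(code).digest()
    quote = hmac.new(DEVICE_KEY, measurement, hashlib.sha256).digest()
    return measurement, quote

def verify(measurement: bytes, quote: bytes, known_good: bytes) -> bool:
    """Trusted device side: check the signature and the expected hash."""
    ok_sig = hmac.compare_digest(
        quote, hmac.new(DEVICE_KEY, measurement, hashlib.sha256).digest())
    return ok_sig and hmac.compare_digest(measurement, known_good)

trusted_code = b"security-sensitive task"
expected = hashlib.sha256(trusted_code).digest()
m, q = measure_and_attest(trusted_code)
print(verify(m, q, expected))  # True only if the expected code is running
```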


Having established an environment for secure code execution on an individual computer, we then show how to extend trust in this environment to network elements in a secure and efficient manner. This allows us to reexamine the design of network protocols and defenses, since we can now execute code on endhosts and trust the results within the network. Lastly, we extend the user's trust one more step to encompass computations performed on a remote host (e.g., in the cloud). We design, analyze, and prove secure a protocol that allows a user to outsource arbitrary computations to commodity computers run by an untrusted remote party (or parties) who may subject the computers to both software and hardware attacks. Our protocol guarantees that the user can both verify that the results returned are indeed the correct results of the specified computations on the inputs provided, and protect the secrecy of both the inputs and outputs of the computations. These guarantees are provided in a non-interactive, asymptotically optimal (with respect to CPU and bandwidth) manner.
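
Stated abstractly, in standard verifiable-computation notation (included as an illustration rather than the book's exact formalism), such a protocol provides algorithms and guarantees of the following shape:

```latex
% Verifiable computation for a function F, in sketch form:
%   KeyGen(F)            -> (pk, sk)        one-time setup
%   ProbGen(sk, x)       -> (sigma_x, tau)  encode the input
%   Compute(pk, sigma_x) -> sigma_y         untrusted worker's step
%   Verify(sk, tau, sigma_y) -> y or \bot   accept only a correct result
% Correctness: Verify(sk, tau, Compute(pk, ProbGen(sk, x))) = F(x).
% Soundness:   no efficient adversary can make Verify accept y' \neq F(x),
%              except with negligible probability.
% Privacy:     sigma_x and sigma_y reveal nothing about x or F(x).
```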


Thus, extending a user's trust, via software, hardware, and cryptographic techniques, allows us to provide strong security protections for both local and remote computations on sensitive data, while still preserving the performance and features of commodity computers.