[1] | W. Polak. Anwendung der axiomatischen Definitionsmethode auf höhere Programmiersprachen. In H.-J. Schneider and M. Nagl, editors, Programmiersprachen, 4. Fachtagung der GI, volume 1 of Informatik-Fachberichte, pages 12--18, Berlin, March 1976. Springer Verlag. [ bib | http ] |
[2] |
Wolfgang Polak.
An exercise in automatic program verification.
IEEE Transactions on Software Engineering, SE-5(5):453--458,
September 1979.
[ bib |
http ]
This paper describes in some detail the computer-aided proof of a permutation program obtained using the Stanford Pascal verifier. We emphasize the systematic way in which a proof can be developed from an intuitive understanding of the program. The paper illustrates the current state of the art in automatic program verification. |
[3] |
David C. Luckham, Steven M. German, Friedrich W. von Henke, Richard A. Karp,
P. W. Milne, Derek C. Oppen, Wolfgang Polak, and William L. Scherlis.
Stanford Pascal verifier user manual.
Technical Report STAN-CS-TR-79-731, Stanford University, Department
of Computer Science, March 1979.
[ bib ]
The Stanford PASCAL verifier is an interactive program verification system. It automates much of the work necessary to analyze a program for consistency with its documentation, and to give a rigorous mathematical proof of such consistency or to pin-point areas of inconsistency. It has been shown to have applications as an aid to programming, and to have potential for development as a new and useful tool in the production of reliable software. |
[4] |
David C. Luckham and Wolfgang Polak.
Ada exceptions: specification and proof techniques.
Technical Report STAN-CS-TR-80-789, Stanford University, Department
of Computer Science, February 1980.
[ bib |
.pdf ]
A method of documenting exception propagation and handling in Ada programs is proposed. Exception propagation declarations are introduced as a new component of Ada specifications. This permits documentation of those exceptions that can be propagated by a subprogram. Exception handlers are documented by entry assertions. Axioms and proof rules for Ada exceptions are given. These rules are simple extensions of previous rules for Pascal and define an axiomatic semantics of Ada exceptions. As a result, Ada programs specified according to the method can be analysed by formal proof techniques for consistency with their specifications, even if they employ exception propagation and handling to achieve required results (i.e. non-error situations). Example verifications are given. |
[5] |
David C. Luckham and W. Polak.
Ada exception handling: an axiomatic approach.
ACM Transactions on Programming Languages and Systems,
2(2):225--233, April 1980.
[ bib |
http ]
A method of documenting exception propagation and handling in Ada programs is proposed. Exception propagation declarations are introduced as a new component of Ada specifications, permitting documentation of those exceptions that can be propagated by a subprogram. Exception handlers are documented by entry assertions. Axioms and proof rules for Ada exceptions are given. These rules are simple extensions of previous rules for Pascal and define an axiomatic semantics of Ada exceptions. As a result, Ada programs specified according to the method can be analyzed by formal proof techniques for consistency with their specifications, even if they employ exception propagation and handling to achieve required results (i.e., nonerror situations). Example verifications are given. |
[6] |
Wolfgang Heinz Polak.
Theory of compiler specification and verification.
Technical Report STAN-CS-80-802, Stanford University, Stanford, CA,
May 1980.
[ bib ]
The formal specification, design, implementation, and verification of a compiler for a Pascal-like language is described. All components of the compilation process such as scanning, parsing, type checking, and code generation are considered. The implemented language contains most control structures of Pascal, recursive procedures and functions, and jumps. It provides user defined data types including arrays, records, and pointers. A simple facility for input-output is provided. The target language assumes a stack machine including a display mechanism to handle procedure and function calls. The compiler itself is written in Pascal Plus, a dialect of Pascal accepted by the Stanford verifier. The Stanford verifier is used to give a complete formal machine checked verification of the compiler. One of the main problem areas considered is the formal mathematical treatment of programming languages and compilers suitable as input for automated program verification systems. Several technical and methodological problems of mechanically verifying large software systems are considered. Some new verification techniques are developed, notably methods to reason about pointers, fixed points, and quantification. These techniques are of general importance and are not limited to compiler verification. The result of this research demonstrates that construction of large correct programs is possible with the existing verification technology. It indicates that verification will become a useful software engineering tool in the future. Several problem areas of current verification systems are pointed out and areas for future research are outlined. |
[7] |
David C. Luckham and Wolfgang Polak.
A practical method of documenting and verifying Ada programs with
packages.
In Proceedings of the ACM SIGPLAN Symposium on the Ada Programming
Language, pages 113--122, December 1980.
[ bib |
http ]
We present a method of formal specification of Ada programs containing packages. The method suggests concepts and guidelines useful for giving adequate informal documentation of packages by means of comments. The method depends on (1) the standard inductive assertion technique for subprograms, (2) the use of history sequences in assertions specifying the declaration and use of packages, and (3) the addition of three categories of specifications to Ada package declarations: (a) visible specifications, (b) boundary specifications, (c) internal specifications. Axioms and proof rules for the Ada package constructs (declaration, instantiation, and function and procedure call) are given in terms of history sequences and package specifications. These enable us to construct formal proofs of the correctness of Ada programs with packages. The axioms and proof rules are easy to implement in automated program checking systems. The use of history sequences both in informal documentation and in formal specifications and proofs is illustrated by examples. |
[8] |
Wolfgang Polak.
Program verification based on denotational semantics.
In Conference Record of the Eighth ACM Symposium on Principles
of Programming Languages, pages 149--158. ACM, January 1981.
[ bib |
.pdf ]
A theory of partial correctness proofs is formulated in Scott's logic of computable functions. This theory allows mechanical construction of verification conditions solely on the basis of a denotational language definition. Extensionally these conditions, the resulting proofs, and the required program augmentation are similar to those of Hoare style proofs; conventional input, output, and invariant assertions in a first order assertion language are required. The theory applies to almost any sequential language defined by a continuation semantics; for example, there are no restrictions on aliasing or side-effects. Aspects of "static semantics", such as type and declaration constraints, which are expressed in the denotational definition are validated as part of the verification condition generation process. |
[9] | W. Polak. Program verification at Stanford: past, present, future. In J. Siekmann, editor, Workshop on Artificial Intelligence, volume 47 of Informatik-Fachberichte, pages 256--276, Berlin, 1981. Springer Verlag. [ bib | http ] |
[10] | W. Polak. Compiler Specification and Verification, volume 124 of Lecture Notes in Computer Science. Springer, Berlin, 1981. [ bib | http ] |
[11] |
Wolfgang Polak.
Framework for a knowledge-based programming environment.
In Int'l Workshop on Programming Environments, volume 244 of
Lecture Notes in Computer Science, pages 566--574, Trondheim, Norway,
June 1986. Springer Verlag.
[ bib ]
We develop a formal model describing the basic objects and operations of the software development process. This model allows us to characterize functions performed by support environments and to formally define the semantics of environment tools. Representing information about available tools, administrative procedures, and constraints in a knowledge base allows automated support for many software development and life-cycle activities. Constraint maintenance within a knowledge base takes over many routine administrative tasks; program synthesis techniques can be used to generate complex plans. |
[12] | Wolfgang Polak. AI approaches to software engineering. Workshop on Complementary Approaches to Software Engineering, Imperial College, London, June 1986. [ bib ] |
[13] |
Wolfgang Polak.
Maintainability and reusable program designs.
In Conf. on Software Reusability and Maintainability, Tysons
Corner, VA, September 1986.
[ bib ]
Problems and shortcomings of the current maintenance paradigm and the approach to software reuse are analyzed. We argue that knowledge-based transformation techniques will provide solutions in both areas. Such techniques will enable maintenance to be performed on the specification level. Efficient code can be regenerated from changed specifications by replaying recorded design decisions. Program transformations can be used to capture and represent general as well as application specific programming knowledge. Reuse of such knowledge is the most flexible and general form of reuse. |
[14] |
Allen Goldberg, Cordell Green, Wolfgang Polak, and Richard Jullig.
Iteration in the software process.
In M. Dowson, editor, Proceedings of the 3rd
International Software Process Workshop, pages 105--108, November 1986.
[ bib |
http ]
In this position paper we discuss the model of software development adopted by Kestrel Institute, how iteration relates to this model, what knowledge-based tools have been developed that support iteration and specific, ongoing research on automating formal program developments. |
[15] |
Wolfgang Polak.
Framework for a knowledge-based programming environment.
Technical Report KES.U.86.3, Kestrel Institute, June 1987.
[ bib ]
A formal model for describing the semantics of operations in software engineering environments is presented. The proposed model consists of two parts, a description of the data domains, and an axiomatic definition of tool invocations and their composition. Objects in environments are characterized by their hierarchical composition and by dependencies between their sub-components. A general data model is developed, allowing uniform description of such diverse domains as source programs, requirements, and documentation. A first-order assertion language is introduced. The semantics of tools are described with pre and post conditions in this assertion language. Further, environment goals, such as “release a system x for a target y with property z” can be stated formally. An agenda is a partially ordered set of tool invocations and user actions. A language for describing agendas is defined. A formal semantics for this language is given by set of axiomatic proof rules. Program synthesis and planning techniques can be used to generate an optimal agenda such that its precondition is satisfied in the current state and its post condition implies a given goal statement. Combined with a knowledge base this mechanism can serve as the basis for the development of sophisticated environments that provide automated support for many software development and life-cycle activities. |
[16] | Richard Jullig, Peter Ladkin, Li-Mei Gilham, and Wolfgang Polak. KBSA project management assistant final technical report (2 vols). Technical Report RADC-TR-87-78, Rome Air Development Center, July 1987. [ bib ] |
[17] | R. Jullig, W. Polak, P. Ladkin, and L. Gilham. Knowledge-based software project management. Technical Report KES.U.87.3, Kestrel Institute, 1987. [ bib ] |
[18] | W. Polak. A technique for defining predicate transformers. Technical Report 17-4, Odyssey Research Associates, Ithaca, NY, October, 1988. [ bib ] |
[19] | W. Polak. Predicate transformer semantics for Ada. Technical Report 89-39, Odyssey Research Associates, September, 1989. [ bib ] |
[20] |
David Guaspari, Carla Marceau, and Wolfgang Polak.
Formal verification of Ada programs.
IEEE Transactions on Software Engineering, 16(9):1058--1075,
September 1990.
[ bib |
http ]
The Penelope verification editor and its formal basis are described. Penelope is a prototype system for the interactive development and verification of programs that are written in a rich subset of sequential Ada. Because it generates verification conditions incrementally, Penelope can be used to develop a program and its correctness proof in concert. If an already-verified program is modified, one can attempt to prove the modified version by replaying and modifying the original sequence of proof steps. Verification conditions are generated by predicate transformers whose logical soundness can be proven by establishing a precise formal connection between predicate transformation and denotational definitions in the style of continuation semantics. Penelope's specification language, Larch/Ada, belongs to the family of Larch interface languages. It scales up properly, in the sense that one can demonstrate the soundness of decomposing an implementation hierarchically and reasoning locally about the implementation of each node in the hierarchy. |
[21] | H. Graves and W. Polak. Common intermediate design language overview. Lockheed Palo Alto Research Laboratory, 1991. [ bib ] |
[22] | David Guaspari, Carla Marceau, and Wolfgang Polak. Formal verification of Ada programs. In Ursula Martin and Jeannette M. Wing, editors, First International Workshop on Larch, Dedham 1992, pages 104--141. Springer-Verlag, 1992. [ bib ] |
[23] |
W.H. Graves and W. Polak.
Common intermediate design language.
In Hawaii International Conference on System Sciences, January
1992.
[ bib |
http ]
The Common Intermediate Design Language (CIDL) is a high-level executable system design language intended for the evolutionary prototyping of large distributed software systems. CIDL was designed to be the target language for code synthesis from high-level system descriptions and the source language for translation into Ada. The resulting design is a typed language with higher-order functions, polymorphism, and concurrency constructs. The language uses types as a representation of formal specifications. |
[24] |
Carla Marceau and Wolfgang Polak.
Modular verification of Ada library units.
In Seventh Annual Conference on Computer Assurance,
Gaithersburg, June 1992.
[ bib |
http ]
Modular verification of Ada library units enables programmers to write and verify small program units and to compose them with minimal additional effort into larger correct programs. Penelope is a prototype verification environment for Ada that supports separate verification of program units and their composition. The authors have extended Penelope to enable verification of larger Ada programs, consisting of multiple compilation units. They discuss two major issues that arise from the composition of program modules. The first is ensuring that the composition itself is correct, that is, that assumptions made by one module about another indeed hold. The elaboration of Ada packages poses new problems, which are described along with the solution adopted. A novel technique for reducing the amount of annotation required from the user is described. The second issue is maintaining consistency between the various modules without incurring excessive overhead. The way in which a set of modules is kept consistent depends on the structure of the language. The method, called separate verification, is closely modeled on the technique of separate compilation used in Ada. How Ada techniques can be adapted for a verification environment is discussed. |
[25] | David Guaspari, John McHugh, Wolfgang Polak, and Mark Saaltink. Towards a formal semantics for Ada 9X. NASA Contractor Report 195037, March 1995. [ bib | .pdf ] |
[26] | Wolfgang Polak, Lester D. Nelson, and Timothy W. Bickmore. Reengineering IMS databases to relational systems. In 7th Annual Software Technology Conference, Salt Lake City, April 1995. Published on CD-ROM. [ bib ] |
[27] |
Eleanor Rieffel and Wolfgang Polak.
Introduction to quantum computing for non-physicists.
Technical Report TR-98-044, FX Palo Alto Laboratory, August 1998.
[ bib |
http ]
Richard Feynman's observation that quantum mechanical effects could not be simulated efficiently on a computer led to speculation that computation in general could be done more efficiently if it used quantum effects. This speculation appeared justified when Peter Shor described a polynomial time quantum algorithm for factoring integers. In quantum systems, the computational space increases exponentially with the size of the system which enables exponential parallelism. This parallelism could lead to exponentially faster quantum algorithms than possible classically. The catch is that accessing the results, which requires measurement, proves tricky and requires new non-traditional programming techniques. The aim of this paper is to guide computer scientists and other non-physicists through the conceptual and notational barriers that separate quantum computing from conventional computing. We introduce basic principles of quantum mechanics to explain where the power of quantum computers comes from and why it is difficult to harness. We describe quantum cryptography, teleportation, and dense coding. Various approaches to harnessing the power of quantum parallelism are explained, including Shor's algorithm, Grover's algorithm, and Hogg's algorithms. We conclude with a discussion of quantum error correction. |
[28] | Wolfgang Polak. Formal methods in practice. In Proceedings of The 1998 ARO/ONR/NSF/DARPA Monterey Workshop on Engineering Automation for Computer Based Systems, number NPS-CS-99-00, Naval Postgraduate School, Monterey, California, 1998. [ bib ] |
[29] |
Tad Hogg, Carlos Mochon, Wolfgang Polak, and Eleanor Rieffel.
Tools for quantum algorithms.
International Journal of Modern Physics C, 10(7):1347--1361,
1999.
[ bib |
http ]
We present efficient implementations of a number of operations for quantum computers. These include controlled phase adjustments of the amplitudes in a superposition, permutations, approximations of transformations and generalizations of the phase adjustments to block matrix transformations. These operations generalize those used in proposed quantum search algorithms. |
[30] |
Eleanor Rieffel and Wolfgang Polak.
An introduction to quantum computing for non-physicists.
ACM Computing Surveys, 32(3):300--335, September 2000.
[ bib |
http ]
Richard Feynman's observation that quantum mechanical effects could not be simulated efficiently on a computer led to speculation that computation in general could be done more efficiently if it used quantum effects. This speculation appeared justified when Peter Shor described a polynomial time quantum algorithm for factoring integers. In quantum systems, the computational space increases exponentially with the size of the system which enables exponential parallelism. This parallelism could lead to exponentially faster quantum algorithms than possible classically. The catch is that accessing the results, which requires measurement, proves tricky and requires new non-traditional programming techniques. The aim of this paper is to guide computer scientists and other non-physicists through the conceptual and notational barriers that separate quantum computing from conventional computing. We introduce basic principles of quantum mechanics to explain where the power of quantum computers comes from and why it is difficult to harness. We describe quantum cryptography, teleportation, and dense coding. Various approaches to harnessing the power of quantum parallelism are explained, including Shor's algorithm, Grover's algorithm, and Hogg's algorithms. We conclude with a discussion of quantum error correction. |
[31] | Eleanor Rieffel and Wolfgang Polak. Introduction to quantum computing (in Russian). Quantum Computers and Computing, 1(1):4--57, 2000. R&C Dynamics, Moscow. [ bib | .pdf ] |
[32] |
Patrick Chiu, Andreas Girgensohn, Wolfgang Polak, Eleanor Rieffel, Lynn Wilcox,
and Forrest H. Bennett III.
A genetic segmentation algorithm for image data streams and video.
In Genetic and Evolutionary Computation Conference, Las Vegas,
NV, 2000.
[ bib |
.pdf ]
We describe a genetic segmentation algorithm for image data streams and video. This algorithm operates on segments of a string representation. It is similar to both classical genetic algorithms that operate on bits of a string and genetic grouping algorithms that operate on subsets of a set. It employs a segment fair crossover operation. For evaluating segmentations, we define similarity adjacency functions, which are extremely expensive to optimize with traditional methods. The evolutionary nature of genetic algorithms offers a further advantage by enabling incremental segmentation. Applications include browsing and summarizing video and collections of visually rich documents, plus a way of adapting to user access patterns. |
[33] |
Patrick Chiu, Andreas Girgensohn, Wolfgang Polak, Eleanor Rieffel, and Lynn
Wilcox.
A genetic algorithm for video segmentation and summarization.
In IEEE International Conference on Multimedia and Expo, New
York, NY, 2000. IEEE.
[ bib |
.pdf ]
We describe a genetic segmentation algorithm for video. This algorithm operates on segments of a string representation. It is similar to both classical genetic algorithms that operate on bits of a string and genetic grouping algorithms that operate on subsets of a set. For evaluating segmentations, we define similarity adjacency functions, which are extremely expensive to optimize with traditional methods. The evolutionary nature of genetic algorithms offers a further advantage by enabling incremental segmentation. Applications include video summarization and indexing for browsing, plus adapting to user access patterns. |
[34] |
Wolfgang Polak.
Method and System for Constructing Adaptive and Resilient
Software.
US Patent 6,226,627 B1, May 2001.
[ bib |
http ]
A dependency action system uses redundant sets of dynamically reconfigurable functional components to achieve robustness and fault tolerance, and to achieve self-optimization by learning and planning techniques that use time-stamps and/or computation stamps as a key indicator. The dependency action system is based on functional components, or actions, which act on data values that are stored in stamped storage locations. Data is read and written to these storage locations, updating the stamps as appropriate. The execution of an action is controlled by the stamps of its enabling and disabling storage locations. The dependency action system specifies an action as enabled if new data has arrived in the enabling storage locations. Updating the stamp of the disabling storage locations disables the action. If an alternative action succeeds and produces a value, the other alternative actions become disabled. If one action fails to produce a value to a storage location, other alternative actions may still be enabled and can be executed. Thus, the dependency action system supports automatic recovery from failure of an individual action. The dependency action system accumulates statistical information about the behavior of the actions, which includes the probability that a particular disabling storage location will be updated by an action and the average cost of an action. The dependency action system uses this information to plan a sequence of action executions that most likely leads to the cheapest solution of a given task. |
[35] |
Wolfgang Polak.
Formal methods in practice.
Science of Computer Programming, 42(1):75--85, Jan 2002.
[ bib ]
Technology transfer from academic research to industrial practice is hampered by social, political, and economic problems more than by technical issues. This paper describes one instance of successful technology transfer based on a special-purpose language and an associated translation tool tailored to the customer's needs. The key lesson to be learned from this example is that mathematical formalisms must be transparent to the user. Formalisms can be effectively employed if they are represented by tools that fit into existing work processes. It is suggested that special-purpose, domain-specific languages and their translators are an important vehicle to transition advanced technology to practice. This approach enables domain experts to solve problems using familiar terminology. It enables engineers of all disciplines to utilize computers without becoming software engineers. In doing so, we not only mitigate the chronic shortage of qualified software personnel but also simplify the problem of requirements analysis and specification. |
[36] |
Brian Arthur and Wolfgang Polak.
The evolution of technology within a simple computer model.
Technical report, Santa Fe Institute, December 2004.
[ bib |
.pdf ]
New technologies are constructed from technologies that previously exist, and in turn these new technologies offer themselves as possible components---building blocks---for the construction of further new technologies. Thus in 1912 the amplifier circuit was constructed from the already existing triode vacuum tube in combination with other circuit components. It in turn made possible the oscillator and the heterodyne mixer; which together with other standard components made possible continuous-wave radio transmission. The collective of technology in this way forms a network of elements where novel elements are created from existing ones and where more complicated elements evolve from simpler ones. We model this phenomenon of technology creating itself by setting up an artificial system in which new elements (logic circuits) build themselves from simpler existing elements, and we study the properties of the resulting build-out. |
[37] | Patrick Chiu, Andreas Girgensohn, Surapong Lertsithichai, Wolfgang Polak, and Frank Shipman. Systems and methods for creating an interactive 3D visualization of indexed media. US Patent, US 20080104546 A1, 2008. [ bib ] |
[38] |
Patrick Chiu, Andreas Girgensohn, Wolfgang Polak, Eleanor Rieffel, Lynn Wilcox,
and Forrest Bennett.
A genetic algorithm for summarizing image document streams and
video key frame selection.
US Patent 6,819,795, November 2004.
[ bib |
http ]
A method, information system, and computer-readable medium are provided for segmenting a plurality of data, such as multimedia data, and in particular an image document stream. Segment boundary points may be used for retrieving and/or browsing the plurality of data. Similarly, segment boundary points may be used to summarize the plurality of data. Examples of image document streams include video, PowerPoint slides, and NoteLook pages. A genetic method having a fitness or evaluation function using information retrieval concepts, such as importance and precedence, is used to obtain segment boundary points. The genetic method is able to evaluate a large amount of data in a cost effective manner. The genetic method is also able to run incrementally on streaming video and adapt to usage patterns by considering frequently accessed images. |
[39] |
Patrick Chiu, Andreas Girgensohn, Surapong Lertsithichai, Wolfgang Polak, and
Frank Shipman.
MediaMetro: Browsing multimedia document collections with a 3D city
metaphor.
In ACM Multimedia 2005, Technical Demonstrations, November
2005.
[ bib |
.pdf ]
The MediaMetro application provides an interactive 3D visualization of multimedia document collections using a city metaphor. The directories are mapped to city layouts using algorithms similar to treemaps. Each multimedia document is represented by a building and visual summaries of the different constituent media types are rendered onto the sides of the building. From videos, Manga storyboards with keyframe images are created and shown on the façade; from slides and text, thumbnail images are produced and subsampled for display on the building sides. The images resemble windows on a building and can be selected for media playback. To support more facile navigation between high overviews and low detail views, a novel swooping technique was developed that combines altitude and tilt changes with zeroing in on a target. |
[40] | Wolfgang Polak and W. Brian Arthur. System and Method for Automatic Design of Components in Libraries. US Patent 7,711,674 B2, May 2007. [ bib ] |
[41] |
W. Brian Arthur and Wolfgang Polak.
The evolution of technology within a simple computer model.
Complexity, 11(5):23--31, May 2006.
[ bib |
http ]
Technology -- the collection of devices and methods available to human society -- evolves by constructing new devices and methods from ones that previously exist, and in turn offering these as possible components -- building blocks -- for the construction of further new devices and elements. The collective of technology in this way forms a network of elements where novel elements are created from existing ones and where more complicated elements evolve from simpler ones. We model this evolution within a simple artificial system on the computer. The elements in our system are logic circuits. New elements are formed by combination from simpler existing elements (circuits), and if a novel combination satisfies one of a set of needs, it is retained as a building block for further combination. We study the properties of the resulting build out. We find that our artificial system can create complicated technologies (circuits), but only by first creating simpler ones as building blocks. Our results mirror Lenski et al.'s: that complex features can be created in biological evolution only if simpler functions are first favored and act as stepping stones. We also find evidence that the resulting collection of technologies exists at self-organized criticality. |
[42] |
Xiaohua Sun, Patrick Chiu, Jeffrey Huang, Maribeth Back, and Wolfgang Polak.
Implicit brushing and target snapping: Data exploration and
sense-making on large displays.
In Proceedings of AVI. ACM Press, May 2006.
[ bib |
.pdf ]
During grouping tasks for data exploration and sense-making, the criteria are normally not well-defined. When users are bringing together data objects thought to be similar in some way, implicit brushing continually detects for groups on the freeform workspace, analyzes the groups' text content or metadata, and draws attention to related data by displaying visual hints and animation. This provides helpful tips for further grouping, group meaning refinement and structure discovery. The sense-making process is further enhanced by retrieving relevant information from a database or network during the brushing. Closely related to implicit brushing, target snapping provides a useful means to move a data object to one of its related groups on a large display. Natural dynamics and smooth animations also help to prevent distractions and allow users to concentrate on the grouping and thinking tasks. Two different prototype applications, note grouping for brainstorming and photo browsing, demonstrate the general applicability of the technique. |
[43] |
Cheng-Jia Lai and Wolfgang Polak.
A collaborative approach to stochastic load balancing with networked
queues of autonomous service clusters.
In MobCops 2006 Workshop, Atlanta, Georgia, November 2006.
IEEE/ACM.
[ bib |
.pdf ]
Load balancing has been an increasingly important issue for handling computationally intensive tasks in a distributed system such as in Grid and cluster computing. In such systems, multiple server instances are installed for handling requests from client applications, and each request (or task) typically needs to stay in a queue before an available server is assigned to process it. In this paper, we propose a high-performance queueing method for implementing a shared queue for collaborative clusters of servers. Each cluster of servers maintains a local queue and queues of different clusters are networked to form a unified (or shared) queue that may dispatch tasks to all available servers. We propose a new randomized algorithm for forwarding requests in an overcrowded local queue to a networked queue based on load information of the local and neighboring clusters. The algorithm achieves both load balancing and locality awareness. |
[44] |
Wolfgang Polak.
A Method for Automatically Generating Provably Correct Code.
US Patent, 7,243,086, 2007.
[ bib |
http ]
A provably correct computer program can be generated using genetic programming techniques. A desired behavior is used to define a formal specification. An initial population of programs is created where each program has a meaning that can be defined using a formalization technique. A fitness function is applied to measure a distance between the meaning of a program, i.e., its actual behavior, and the specification. Any program having a zero value as the measure of distance between the meaning of the program and the specification is determined to be provably correct. If, after the fitness of some or all of the programs in the current generation has been computed, a provably correct program has not yet been found in the current generation, mutation and/or crossover techniques are performed on at least some of the fittest individuals in the current generation to create the programs of the next generation. |
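The generational loop sketched in the abstract can be written generically. This is a minimal sketch, not the patented method: the selection scheme (keep the fittest half, refill by crossover plus mutation) is an assumption, and `fitness`, `mutate`, and `crossover` are caller-supplied placeholders.

```python
import random

def evolve(population, fitness, mutate, crossover, max_generations=100):
    """Generic generational loop of the kind the patent describes:
    fitness measures the distance from a program's meaning to the formal
    specification, and distance zero means the program is provably correct."""
    for _ in range(max_generations):
        scored = sorted(population, key=fitness)
        if fitness(scored[0]) == 0:
            return scored[0]  # zero distance to the spec: provably correct
        # Keep the fittest half and refill with mutated/crossed offspring.
        parents = scored[: max(2, len(scored) // 2)]
        children = []
        while len(parents) + len(children) < len(population):
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))
        population = parents + children
    return None  # no provably correct program found within the budget
```

The key point of the patent's framing is the termination test: because fitness is a distance to a formal specification rather than a score on test cases, reaching zero is a proof of correctness, not merely a passing grade.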
[45] |
Volker Roth, Wolfgang Polak, Eleanor Rieffel, and Thea Turner.
Simple and effective defense against evil twin access points.
In WiSec '08: Proceedings of the first ACM conference on
Wireless network security, pages 220--235, New York, NY, USA, 2008. ACM.
[ bib |
.pdf ]
Wireless networking is becoming widespread in many public places such as cafes. Unsuspecting users may become victims of attacks based on “evil twin” access points. These rogue access points are operated by criminals in an attempt to launch man-in-the-middle attacks. We present a simple protection mechanism against binding to an evil twin. The mechanism leverages short authentication string protocols for the exchange of cryptographic keys. The short string verification is performed by encoding the short strings as a sequence of colors, rendered sequentially by the user's device and by the designated access point of the cafe. The access point must have a light capable of showing two colors and must be mounted prominently in a position where users can have confidence in its authenticity. We conducted a usability study with patrons in several cafes, and participants found our protection mechanism very usable. |
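The verification step described above is simple enough to sketch. The bit-to-color mapping below is a hypothetical encoding, assuming the two-color light mentioned in the abstract; the actual protocol details are in the paper.

```python
def to_colors(bits, palette=("red", "blue")):
    """Map each bit of the short authentication string to one color flash.
    A two-color light suffices, matching the access-point hardware described."""
    return [palette[int(b)] for b in bits]

def sequences_match(device_bits, ap_bits):
    """The user accepts the connection only if the color sequence shown by
    their device agrees flash-for-flash with the one shown by the cafe's
    prominently mounted access point."""
    return to_colors(device_bits) == to_colors(ap_bits)
```

An evil twin that runs its own key exchange with the victim ends up with a different short authentication string, so its color sequence will visibly disagree with the legitimate access point's light.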
[46] | D.M. Hilbert, D.A. Billsus, J.E. Adcock, W. Polak, L. Denoue, and E.G. Rieffel. Usable and Secure Portable Storage. US Patent 0114990 A1, May 2008. [ bib ] |
[47] |
Patrick Chiu, Jeffrey Huang, Maribeth Back, Nicholas Diakopoulos, John Doherty,
Wolfgang Polak, and Xiaohua Sun.
mTable: browsing photos and videos on a tabletop system.
In Abdulmotaleb El-Saddik, Son Vuong, Carsten Griwodz, Alberto Del
Bimbo, K. Selçuk Candan, and Alejandro Jaimes, editors, Proceedings
of the 16th International Conference on Multimedia 2008, Vancouver, British
Columbia, Canada, October 26-31, 2008, pages 1107--1108. ACM, 2008.
[ bib |
http ]
In this video demo, we present mTable, a multimedia tabletop system for browsing photo and video collections. We have developed a set of applications for visualizing and exploring photos, a board game for labeling photos, and a 3D cityscape metaphor for browsing videos. The system is suitable for use in a living room or office lounge, and can support multiple displays by visualizing the collections on the tabletop and showing full-size images and videos on another flat panel display in the room. |
[48] | Eleanor Rieffel and Wolfgang Polak. Quantum Computing, a Gentle Introduction. MIT Press, Cambridge, MA, 2011. [ bib | http ] |
[49] |
Kazumasa Murai, Patrick Chiu, Wolfgang Polak, Andreas Girgensohn, and Qiong
Liu.
Annealing Algorithm for Non-Rectangular Shaped Stained Glass
Collages.
US Patent 8,144,919 B2, Mar 2012.
[ bib |
http ]
The present invention relates to a method to make effective use of non-rectangular display space for displaying a collage. In an embodiment of the invention, a heterogeneous set of images can be arranged to display the region of interest of the images while avoiding overlapping regions of interest. The background gaps between the regions of interest can be filled by extending the regions of interest using a Voronoi technique. This produces a stained glass effect for the collage. In an embodiment of the present invention, the technique can be applied to irregular shapes including circular shapes with a hole in the middle. In an embodiment of the present invention, the technique can be used to print labels for disks. |
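The Voronoi gap-filling step can be illustrated in isolation. This sketch assumes each region of interest is reduced to a single seed point and assigns every pixel to its nearest seed; the patent's annealing of positions and handling of irregular outlines are not shown.

```python
def voronoi_fill(width, height, seeds):
    """Assign every pixel of a width x height canvas to the index of the
    nearest seed point (squared Euclidean distance). Extending each image's
    region of interest this way fills the background gaps and yields the
    'stained glass' cell boundaries between neighboring images."""
    def nearest(x, y):
        return min(range(len(seeds)),
                   key=lambda i: (seeds[i][0] - x) ** 2 + (seeds[i][1] - y) ** 2)
    return [[nearest(x, y) for x in range(width)] for y in range(height)]
```

On a real canvas the same assignment would be computed per pixel over the image mask, so the cells automatically conform to irregular outer shapes such as an annular disk label.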
[50] | Volker Roth, Wolfgang Polak, and Eleanor Rieffel. System and Method for Human Assisted Secure Information Exchange. US Patent 8,429,405 B2, Apr 2013. [ bib ] |
[51] | Sanjeev Kaushal, Kenji Sugishima, Sukesh Janubhai Patel, Robert Filman, Wolfgang Polak, Orion Wolfe, and Jessie Burger. Biologically based chamber matching. US Patent 8,723,869 B2, May 2014. [ bib ] |
[52] | Sanjeev Kaushal, Sukesh Janubhai Patel, Wolfgang Polak, Aaron Archer Waterman, and Orion Wolfe. Method and apparatus for autonomous tool parameter impact identification system for semiconductor manufacturing. US Patent 9746849, August 2017. [ bib ] |
[53] | Sanjeev Kaushal, Sukesh Janubhai Patel, Wolfgang Polak, and Orion Wolfe. Tool failure analysis using space-distorted similarity. US Patent 10228678, March 2019. [ bib ] |
[54] | Sanjeev Kaushal, Sukesh Janubhai Patel, Wolfgang Polak, and Orion Wolfe. Method and apparatus for autonomous tool parameter impact identification system for semiconductor manufacturing. US Patent 10571900, February 2020. [ bib ] |
This file was generated by bibtex2html 1.99.