Stephen Wolfram was educated at Eton, Oxford, and Caltech, receiving his Ph.D. in theoretical physics in 1979 at the age of 20. His early work in physics and computer science was recognized by a MacArthur award in 1981. In the early 1980s he made a series of now-classic discoveries about systems known as cellular automata, leading to numerous applications in physics, mathematics, computer science, biology, and other fields. In 1986 he founded Wolfram Research, Inc. and began the creation of Mathematica, now the world's leading software system for technical computing and symbolic programming (and the tool that made A New Kind of Science possible). Over the past decade Wolfram has divided his time between leadership of his company and the pursuit of basic science. The results of Wolfram's fifteen years of work were presented for the first time in his book, A New Kind of Science (May 2002). An instant bestseller, A New Kind of Science made international science news and quickly emerged as one of the most-discussed science books in decades.
The shift of communication to the internet, in particular to email, weblogs (blogs), and online communities, presents an opportunity to study the information dynamics of social networks on a large scale. Blogs, now numbering in the millions, are web pages updated using blogging software that makes it easy for authors to share new content online in the form of time-stamped posts.
One can track how a piece of information spreads by observing when it appears on different blogs. The exact route the information takes is not obvious, since most blog authors will not explicitly identify the source of the information when they write about it. Likely routes can be inferred, however, by analyzing timing information, blogs' past entries, and the explicit network of blogs linking to one another through blogrolls or posts. While one can gain insights from observing how information passes from one individual to another, one can also analyze networks to see how easily one can actively navigate them to locate needed information or individuals.
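To make the inference concrete, here is a minimal sketch in Python (with invented blogs and timestamps, not the speakers' actual method): attribute each blog's mention of a story to the earliest-posting neighbor that wrote about it first.

```python
# A minimal sketch (hypothetical data) of timing-based route inference:
# credit each mention of a story to the earliest linked blog that
# mentioned it before this one did.

# who links to whom (the explicit blogroll/post network)
links = {"carol": {"alice", "bob"}, "bob": {"alice"}, "alice": set()}

# when each blog first mentioned the story (days since it broke)
mention_time = {"alice": 0, "bob": 1, "carol": 3}

def likely_source(blog):
    """Earliest-posting neighbor that wrote about the story before `blog`."""
    earlier = [b for b in links[blog]
               if mention_time.get(b, float("inf")) < mention_time[blog]]
    return min(earlier, key=mention_time.get) if earlier else None

for blog in mention_time:
    print(blog, "<-", likely_source(blog))
# alice <- None   (the origin)
# bob   <- alice
# carol <- alice  (the earliest of its linked neighbors)
```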
One test of the navigability of a network is the classic small world experiment, where subjects attempt to reach a target individual through their chain of acquaintances. Examining an email network within an organization reveals how individuals are capable of routing messages locally, even though their knowledge of the organization's global social network is limited.
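The routing side of the experiment can also be sketched in a few lines; the toy network and one-dimensional "position" below are hypothetical stand-ins for the organizational or geographic similarity people actually use when choosing whom to forward a message to.

```python
# A toy sketch of greedy small-world routing: using only local knowledge,
# forward the message to whichever acquaintance is "closest" to the target.

neighbors = {
    "ana": ["raj", "lee"],
    "raj": ["ana", "kim"],
    "lee": ["ana", "kim"],
    "kim": ["raj", "lee", "tia"],
    "tia": ["kim"],
}
# a stand-in attribute for department/location similarity
pos = {"ana": 0, "raj": 2, "lee": 1, "kim": 4, "tia": 5}

def greedy_route(start, target):
    path, here = [start], start
    while here != target:
        nxt = min(neighbors[here], key=lambda n: abs(pos[n] - pos[target]))
        if abs(pos[nxt] - pos[target]) >= abs(pos[here] - pos[target]):
            return path, False      # stuck in a local minimum: the chain dies
        path.append(nxt)
        here = nxt
    return path, True

print(greedy_route("ana", "tia"))   # (['ana', 'raj', 'kim', 'tia'], True)
```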
Increasing scene complexity is a major challenge in computer graphics. Games, virtual reality, augmented reality, and other applications create ever-increasing demand for image quality and interactive performance. The traditional graphics pipeline will not be able to meet these demands.
This talk describes a new approach to building the graphics pipeline that takes advantage of the limitations of the human eye. The human eye is sensitive to visual features like edges. Feature-driven graphics explicitly finds and represents these edges to achieve both high quality and high performance. In this talk I will describe recent research on feature-driven graphics for the scenes of the future.
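To illustrate the kind of feature involved, here is a toy Python sketch (invented data, not the speaker's method; real feature-driven renderers locate silhouette and shadow edges in 3D scenes) of finding edges as large intensity differences in an image.

```python
# A toy sketch: mark an "edge" wherever neighboring pixels differ sharply.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]

edges = [
    [abs(row[x + 1] - row[x]) > 4 for x in range(len(row) - 1)]
    for row in image
]
for row in edges:
    print("".join("|" if e else "." for e in row))
# .|.
# .|.
# .|.   a vertical edge where the dark region meets the bright one
```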
Kavita Bala is an Assistant Professor in the Computer Science Department and Program of Computer Graphics at Cornell University. She specializes in interactive computer graphics, leading several research projects in interactive rendering, global illumination, image-based modeling and texturing, and visualization. In 2005, Bala co-chaired the Eurographics Symposium on Rendering (EGSR); she has also served on numerous program committees, including SIGGRAPH, the Symposium on Point-Based Graphics, and Graphics Interface, among others. She is a co-author of the graduate textbook "Advanced Global Illumination".
Bala received her B.Tech. from the Indian Institute of Technology (IIT, Bombay), and her S.M. and Ph.D. from the Massachusetts Institute of Technology (MIT). Before starting as a faculty member she was a Postdoctoral Research Associate at the Program of Computer Graphics at Cornell University.
Businesses invest a great deal of time and expense developing software. But all too often the knowledge and insights gained during development disappear into the details of the code, or at best survive only in documents with slender ties to the actual source code. Another name for this latent value is the intent behind the software, which is why we call this approach Intentional Software.
Intentional Software captures the tremendous latent value that is usually lost in the design and development process and makes it part of the software. Using Intentional Software the domain knowledge is captured, not lost. All stakeholders - programmers, subject matter experts and others - can have their design intent clearly represented in the code. This increases the quality and value of the software, primarily by making it easier to develop, maintain and change.
Magnus Christerson is Vice President of Product Management and Marketing at Intentional Software. Prior to joining Intentional Software Corporation, Christerson held multiple positions at Rational Software (acquired by IBM in February 2003), including director of product management and director of product strategy. From 1995 to 2003, Christerson was instrumental in building an industry leadership position for Rational Rose XDE, the leading UML product for visually designing and generating software programs. Christerson joined Rational in 1995 when Rational acquired Objectory, a software startup where he held various software development and product management roles. Christerson is co-author with Ivar Jacobson of Object-Oriented Software Engineering - A Use Case Driven Approach (Addison-Wesley, 1992). He holds an M.Sc. in Computer Science and Technology from the University of Lund, Sweden.
FPGAs provide customers with a fast time to market through device re-programmability. Need a bug fix? Need to add a new protocol? The turnaround to implement these on an FPGA is fast and inexpensive compared with the design re-spins and new tape-outs required of custom ASIC designs.
FPGAs require software to analyze, compile, and debug a user's design. Altera's core software, Quartus® II, implements many state-of-the-art CAD algorithms to achieve the highest performance on the smallest possible device. Advanced features provide debug and visualization tools for the user, essential for dealing with millions upon millions of gates. Jason will give an overview of Altera, FPGAs, and various CAD algorithms used in Altera's software.
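As a flavor of the CAD algorithms involved, here is a minimal sketch of the core of static timing analysis (a hypothetical netlist, not Quartus II internals; assumes Python 3.9+ for graphlib): propagate worst-case arrival times through the gate-level DAG in topological order, and the largest arrival time bounds the achievable clock frequency.

```python
from graphlib import TopologicalSorter

# gate -> (delay in ns, list of fanin gates); primary inputs have no fanins
circuit = {
    "in1": (0.0, []), "in2": (0.0, []),
    "and1": (1.2, ["in1", "in2"]),
    "xor1": (1.5, ["in1", "and1"]),
    "out":  (0.3, ["xor1"]),
}

deps = {g: set(fanin) for g, (_, fanin) in circuit.items()}
arrival = {}
for gate in TopologicalSorter(deps).static_order():
    delay, fanin = circuit[gate]
    arrival[gate] = delay + max((arrival[f] for f in fanin), default=0.0)

critical = max(arrival.values())
print(f"critical path delay: {critical:.2f} ns -> fmax ~ {1000/critical:.0f} MHz")
```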
Jason earned his Bachelor of Science in Computer Science from the University of Illinois, where he was actively involved in ACM@UIUC and co-chaired SIGNet. After graduating, Jason joined Altera, where he has been involved in various projects in the areas of Logic Synthesis, Compiler Infrastructure, and Static Timing Analysis. Jason was one of the key engineers behind the compiler infrastructure, and currently manages the Timing Analysis group.
The ever-increasing complexity of application-specific integrated circuits, as well as the increasing integration of analog blocks, puts many requirements on digital physical design flows. Additional challenges stem from a number of physical effects that must be accounted for at the 90nm and 65nm process nodes, coupled with the industry-wide drive to lower power consumption.
This talk will present system-on-chip design flow methodologies using the Cadence Encounter platform, as well as tools and flow techniques to mitigate the physical effects of 90nm and 65nm CMOS processes. Also presented will be the OpenAccess database, which is being developed as an open C++ API in collaboration between EDA companies and the Silicon Integration Initiative. OpenAccess allows for greatly enhanced interoperability between tools, as well as between the digital and analog domains.
Johannes Grad studied at the University of Erlangen, Germany and the Technical University of Berlin, Germany before receiving the MSEE degree from Illinois Institute of Technology in 2002. He is currently a PhD candidate at the Illinois Institute of Technology. He has been with Cadence Design Systems since 2003, where he is developing design flows for digitally driven physical implementation using the Cadence Encounter platform, as well as tool interoperability using OpenAccess. He has also developed educational material for crete.cadence.com, part of the Cadence University partner program. In 2002, as an intern with Epson R&D, he worked on high-speed analog CMOS design.
Intel Compilers (F95/C++) are used to obtain optimum performance from programs run on Intel processors. As Intel processors have incorporated increasing levels of parallelism for performance, the compilers have evolved to help programmers exploit this parallelism. The talk will describe the many compiler features, and their implementation, that support parallel programming.
Milind Girkar received a B.Tech. degree from the Indian Institute of Technology, Mumbai, an M.Sc. degree from Vanderbilt University, and a Ph.D. degree in Computer Science from the University of Illinois at Urbana-Champaign. Currently, he co-manages the Advanced Development group in the Intel Compiler Lab. Before joining Intel Corp., he worked on an optimizing compiler for the UltraSPARC platform at Sun Microsystems. His e-mail is email@example.com
Quantum computers that would eclipse the performance of conventional digital computers are the holy grail of much current research. Quantum entanglement is basic to their operation and was the subject of a famous debate between Einstein and Bohr. This presentation deals with the questions: What is quantum entanglement, and what is quantum information? Is it something entirely new and fantastic, or is it just a combination of analog and digital information as we already know it?
The presentation will not supply any definite answer to these questions but will show that the area of quantum information is as interesting and challenging as it was at the time of the Einstein/Bohr debate and that the attempts to build quantum computers may provide at least some insights into the nature of quantum information.
The scientific interests of Karl Hess are in the area of solid state electronics and optoelectronics. He has made major contributions to the simulation of electronic transport (quantum transport) in solids and has received several major honors, including membership in the National Academy of Engineering and the National Academy of Sciences. His recent interests are in the areas of quantum transport, quantum computing, and quantum information.
This talk will present the Cell processor, jointly developed by the STI partnership (Sony-Toshiba-IBM). Cell is a heterogeneous chip multiprocessor intended for general-purpose applications, but with a particular emphasis on multimedia performance.
The Cell processor combines a 64-bit Power Architecture(TM) core with eight Synergistic Processors. In many cases it delivers more than an order of magnitude more performance than conventional PC processors. Cell achieves this improvement in performance and power efficiency through a new division of labor between the Power core and the Synergistic Processors. Cell allows for a wide variety of programming models, a selection of which will be presented in this talk. We will end the talk by discussing some applications that seem to fit the Cell processor particularly well, and by indicating areas of further exploration.
Dr. H. Peter Hofstee is the chief architect of the Cell Synergistic Processor, and Cell chief scientist. He received his PhD in computer science from the California Institute of Technology (Caltech) in 1995 and served on the Caltech faculty from 1995 to 1996, teaching computer science and VLSI. In 1996 he joined the IBM Austin research laboratory, where he helped create the first GHz CMOS processor. Between 1997 and 2000 he worked on a number of other high-frequency server processor designs. In 2000 he helped create the concept for Cell and became one of the founding members of the STI (Sony-Toshiba-IBM) design center in the spring of 2001. His current interest focuses on applications of the Cell processor beyond the gaming space and on future Cell designs.
Technological advances are frequently promoted as enabling human freedom and creativity, but the resulting applications often serve very different needs. Scientists and engineers may work with the best intentions in mind, but they are rarely in control of how their innovations are applied in the marketplace or on the battlefield. One reason for this disengagement is the widespread perception that researchers toil away at technical problems and have no broader concerns. Our experience in academic and corporate research labs suggests that this idea is completely false.
A representative of the Institute for Applied Autonomy will present the organization's technical work and motivations, and highlight alternative approaches to the science and engineering disciplines for anyone who needs to think beyond the lab.
The Institute for Applied Autonomy was founded in 1998 as an anonymous collective of engineers, designers, artists and activists who are united by the cause of individual and collective self-determination. The group's stated mission is to develop technologies that extend the autonomy of human activists in the performance of real-world, public acts of expression. The results have included an ultra-cute robot designed for targeted distribution of subversive literature and a small tele-operated robot designed for high-speed graffiti deployment from a remote location. The project I-See is a web-based navigation service that allows users to avoid surveillance altogether by providing them with the path of least surveillance to their destination. TXTmob is a cellphone text message broadcasting system that allows users to send messages to huge numbers of other users. TXTmob was used as an organizational technology during several large-scale public events.
"Enabling the Online Community Through Vertical PSOTs and Automated Adverse Content Delivery Systems"
The increasing popularity of completely virtual workspaces has given rise to an electric new medium, one constantly defining and redefining its own rules. The Internet has established itself as a new form of communication between communities that otherwise would never have existed. In order to survive and thrive in this relatively young technology, one must precariously balance the need for order against the needs of the community.
How are the dynamics of virtual communities defined, particularly as their social structures develop? This talk will discuss the emergence of such virtual communities, the challenges of maintaining order in the midst of electronic chaos, and the building of self-sustaining online neighborhoods through word-of-mouth popularity and original content.
Rich "Lowtax" Kyanka knows nothing about any of this. He failed out of college, barely escaped numerous beatings in high school, and routinely requires third party aid when attempting to use the garage door opener. Rich spends a majority of his day sitting in his basement, insulting teenagers over the Internet and sweating. He has absolutely nothing of value to offer, no provocative insights to lend, and is an utter failure when it comes to public speaking. There is absolutely no chance anybody could learn anything from hearing him speak. Every single individual attending this conference should be ashamed Rich was invited to speak in the same state as them. If you plan on skipping a speech, make it this one, because I'll probably spend the majority of the hour drinking beer and avoiding questions.
In March of this year, AGEIA Technologies announced the world's first hardware accelerator for real-time, interactive physics. Just as 3D graphics chips increase the visual quality of games, AGEIA's aim with the PhysX chip and PhysX SDK is to transform interactivity and dynamism in games to levels well beyond what is possible in today's systems.
Doing physics fast is a technically challenging task that spans interesting problems in parallel algorithms, computational geometry, dynamics, and parallel computer architecture. In this talk, we will provide an overview of AGEIA's PhysX technology. We'll discuss how physics is being used in gaming today, and how it might be used in the future. We'll provide a glimpse into the technology of our first-generation PhysX chip. And of course, we'll have cool demos with lots of things blowing up... er.. we mean physics.
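For a feel of the innermost computation, here is a toy semi-implicit Euler step for particles under gravity (illustrative only, not the PhysX SDK; real engines layer collision detection, constraint solving, and massive parallelism on top of a step like this).

```python
# A toy sketch of the inner loop of game physics: advance every particle
# by one frame, updating velocity first, then position.
GRAVITY = (0.0, -9.81, 0.0)
DT = 1.0 / 60.0                     # one 60 Hz frame

def step(positions, velocities):
    for i, (v, p) in enumerate(zip(velocities, positions)):
        v = tuple(vj + gj * DT for vj, gj in zip(v, GRAVITY))
        positions[i] = tuple(pj + vj * DT for pj, vj in zip(p, v))
        velocities[i] = v

pos = [(0.0, 10.0, 0.0)]
vel = [(1.0, 0.0, 0.0)]
for _ in range(60):                 # simulate one second
    step(pos, vel)
print(pos[0])                       # the particle has fallen roughly 5 m
```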
Sanjay Patel is Chief Architect at AGEIA Technologies and Associate Professor of Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign. At AGEIA, he oversees the research and development of AGEIA's hardware technology.
Christopher Lamb is a member of the Architecture group at AGEIA Technologies. A recent graduate with a B.S. from the ECE department at UIUC, he has been involved in tech ventures from dot-com web startups to co-founding a wireless communications company. He has always had a deep interest in high-performance computing, spurred early on by a visit to NCSA's Cray X-MP as a youngster.
The shocking statement "You have zero privacy anyway, get over it" was made some six years ago by one of our more abrasive industry leaders. Technology has advanced considerably since then, enabling even more ways that our day-to-day activities can be revealed to those with the right connections. Advances in technology are enabling frightening applications such as location tracking, political profiling and classification, and even real-time streaming face recognition.
Worse still, while technology has advanced, consumer protection has retrogressed. An entire industry has arisen around these technologies, with sufficient lobbying pressure and political opportunism to ensure almost zero regulation. The usage of data obtained from these technologies ranges from the helpful to the undesirable to the outright dangerous. This talk will examine several major technologies that are being used for everything from targeted marketing to total surveillance, and the dangers associated with the collection, mining, and inevitable leakage of this data.
Mike Perry is a double UIUC alumnus, receiving both his undergraduate and Masters degrees from this University. During his stay here, he was heavily involved in the ACM, chairing LUG and later co-founding and co-chairing the security research group SIGMil. He is an engineer (both forward and reverse) by trade and is currently employed as such by Riverbed Technology - a Linux startup that builds network acceleration devices. When he is not reverse engineering and accelerating broken network protocols, he procrastinates the completion of ongoing open source and book projects by researching privacy in the digital age.
The Windows kernel was designed with a number of distinctive characteristics that make it interesting to study as a contrast to UNIX-based operating systems. The Windows kernel is highly asynchronous and preemptively multi-threaded. The architecture is organized around a central facility for managing kernel/user references and access to kernel data structures. The I/O system is modular and extensible. The virtual memory manager separates management of virtual addresses from management of memory objects, and provides low-level access to address-space and physical memory management for large applications. CPU scheduling is based on threads rather than processes, and carefully designed to use per-CPU data structures and lock-free synchronization to improve scalability. This talk will provide an overview of the kernel architecture in Windows, focusing on the key aspects that distinguish Windows.
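As a rough illustration of two of these ideas, the toy model below (invented for this writeup, not Windows source) keeps a ready queue per CPU, so the dispatch hot path touches no shared state, and schedules threads rather than processes.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Thread:
    neg_priority: int            # heapq is a min-heap; negate for max-priority
    tid: int = field(compare=False)
    process: str = field(compare=False)

ready = {cpu: [] for cpu in range(2)}      # one ready queue per CPU

def make_ready(cpu, thread):
    heapq.heappush(ready[cpu], thread)     # touches only this CPU's queue

def pick_next(cpu):
    """Dispatch the highest-priority ready thread on this CPU."""
    return heapq.heappop(ready[cpu]) if ready[cpu] else None

make_ready(0, Thread(-8, tid=1, process="winlogon"))
make_ready(0, Thread(-12, tid=2, process="dwm"))   # higher priority
make_ready(1, Thread(-4, tid=3, process="notepad"))
print(pick_next(0))   # tid=2: threads, not processes, are the scheduled unit
```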
Until recently Dave managed kernel development for Windows, including Windows 2003 and early phases of XP SP2 and Vista. He is currently working on releasing kernel sources to universities and developing Project OZ, an experimental environment based on the SPACE project at UC Santa Barbara, where Dave earned his PhD in Electrical & Computer Engineering. Dave is a co-author of and consultant on a Chinese textbook on operating system principles illustrated with Windows, and for the last several years has taught a short course with Prof. Kei Hiraki at the University of Tokyo on Windows internals. Prior to Microsoft, Dave's experience was focused on UNIX kernels, including several years as a Vice President at Culler Scientific Systems. Dave's career began in the late 1970s at Burroughs, where he was a computer architect designing hardware and writing microcode for the B1900.
Blake Ross (born June 12, 1985) is a software developer known for his work on the Mozilla browsers. In particular, he started the Mozilla Firefox project with Dave Hyatt, as well as the Spread Firefox project, while working as a contractor at the Mozilla Foundation. In 2005, he was nominated for Wired magazine's top Rave Award, Renegade of the Year, opposite Larry Page, Sergey Brin, and Jon Stewart.
Ross was hired to work as an intern at Netscape Communications Corporation at the age of 15 while attending Gulliver Preparatory School in Miami, Florida. He graduated in 2003 and is currently attending Stanford University as a member of its graduating class of 2007.
He is currently writing a book: Firefox For Dummies.
We are coming off the largest venture capital investment period in US history, followed by one of the largest equity market crashes. Are we healed yet, and what have we learned? Over the last 10 years, Rob has been both an entrepreneur and a venture capitalist. He raised over $70 million in venture capital from 1998 to 2000 and built his company from an idea to an IPO. He will share with you his perspectives on:
- Determining if your idea is venture fundable
- Raising venture capital
- What really matters in launching and growing your company
Rob Schultz is a Senior Director with IllinoisVENTURES in Champaign. On behalf of IllinoisVENTURES, he currently serves on the boards of iCyt, Mobitrac, Nextumi, RiverGlass, and Tekion. Rob is the founder and former CEO of DigitalWork, Inc., a small-business services company backed by Draper Fisher Jurvetson, TL Ventures, and Dell Ventures. As CEO, Rob led the company in building an award-winning service, raising over $70 million from top-tier venture firms, leading the company through an IPO process, hiring over 140 talented employees, acquiring over 10,000 customers, and developing partnerships with such leading companies as AOL, Dell, Office Depot, and Wells Fargo. Prior to DigitalWork, Rob was the President of Nequity, a subsidiary of Signet Bank that was acquired in 1997. In 2000, Rob was an Ernst & Young Entrepreneur of the Year finalist. He was named one of the "100 most important executives, decision-makers, financiers and idea people in the Chicago Internet community" by Crain's Chicago Business and one of the "top 100 people who put the 'new' in Chicago's new economy" by i-Street Reporter.
Rob received an MBA in finance from the University of Chicago and a bachelor's degree in economics from Northwestern University.
For some niche applications such as large-scale data visualization, parallel software ray tracers are already much faster than the most optimized GPU implementations. However, GPU programs are the only viable choice for most interactive applications. There are three clear possibilities for the future of graphics on the desktop. First is a continuation of z-buffer based GPUs. Second is the emergence of interactive ray tracing running on multicore CPUs. Third is ray tracing using custom hardware (ASIC). This talk examines trends in hardware and application data, argues that ray tracing using custom hardware is the likely winner, and outlines the research problems that must be solved for that to happen.
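The computational kernel at stake is small enough to sketch. The Python below (illustrative only, not any shipping renderer) shows the ray-sphere hit test that custom hardware would run billions of times per second behind an acceleration structure.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None

print(hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```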
Peter Shirley holds a Professor position in the School of Computing at the University of Utah. He has a B.A. in physics from Reed College and a Ph.D. in computer science from the University of Illinois at Urbana-Champaign. He spent four years as an assistant professor at Indiana University and two years as a visiting assistant professor at the Cornell Program of Computer Graphics before moving to Utah. His professional interests include interactive and realistic rendering, statistical computing, visualization, and immersive environments.
Learn how digital certificates and encryption enable secure and verifiable communication on the unregulated information superhighway. This talk will cover the basics of crypto technology, what it is and how it works, in a very accessible and practical way. You will learn what a digital certificate is and how it interacts with various crypto technologies to form the cornerstone of reliable communication and commerce in one of the most hostile technical environments imaginable.
Come to this talk, it will be fun.
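As a small taste of the subject, here is a minimal sketch of certificate-based trust in action, using Python's standard ssl module as a modern stand-in (example.org is a placeholder host, and network access is assumed): the context checks the server's certificate chain against trusted root CAs and verifies the hostname before any application data flows.

```python
import socket, ssl

ctx = ssl.create_default_context()       # loads the system's trusted root CAs
with socket.create_connection(("example.org", 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname="example.org") as tls:
        cert = tls.getpeercert()         # only available once verification passed
        print("issued to:", cert["subject"])
        print("issued by:", cert["issuer"])
        print("expires:  ", cert["notAfter"])
```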
While attending UIUC Chris Sims chaired SigNet, the ACM Special Interest Group for Networking (and Security). While Chris is generally not credited with inventing the internet he was reputed to have used it on several occasions. After graduating in 1998 Chris was hired by FactSet, a leading provider of global financial and economic information, and moved to the San Francisco Bay Area. Chris continues to work as a software engineer for FactSet, use the Internet, and holds a valid driver's license.
Popular computer mythology states that, in order to make our computers more secure, we have to make them harder to use, less functional, or both. Using traditional security strategies, this is true. This is a tragic irony, for security engineering and usability design share the same goal: to ensure that the user is never unpleasantly surprised. Security and usability cannot be traded off against each other; they must be integrated.
This presentation will demonstrate new user-friendly security technologies under development at HP. The Polaris desktop overlay makes Windows safe against several important cyberattacks. The Web Calculus enables easy and secure sharing of data across the Web. Both systems eliminate passwords, access control lists, obscure certificates, funny dialog boxes, and other popular hindrances to usability. Both Polaris and the Web Calculus derive from the same radical architecture for secure computing: the Authorization Based Access Control model, a model of secure computing for those of us who are not computer security gurus.
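The capability idea underlying authorization-based access control can be sketched in a few lines; this toy (invented for illustration, not Polaris or Web Calculus code) shows authority as an unforgeable reference you hold, rather than an identity checked against an access control list.

```python
class File:
    def __init__(self, text):
        self._text = text
    def read(self):
        return self._text
    def write(self, text):
        self._text = text

def read_only(f):
    """Attenuate: hand out a capability that can read but never write."""
    class ReadOnly:
        read = staticmethod(f.read)
    return ReadOnly()

secret = File("launch codes")
cap = read_only(secret)          # give an untrusted component only this
print(cap.read())                # works: the holder was granted read authority
print(hasattr(cap, "write"))     # False: write authority was never granted
```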
Marc Stiegler is both a computer science researcher and a science fiction author. His software product DecideRight was the winner of the Software Publishers Association's CODIE award for Best New Business Software. His novel Valentina was a finalist for the Hugo Award, his novel David's Sling was a finalist for the Prometheus Award, and his nonfiction book Programming Languages for the IBM PC was selected as one of 20 key books on the PC by Byte Magazine. He is the inventor of the foundational techniques currently in use to develop the Polaris cyberdefense system at Hewlett-Packard Laboratories. Polaris is an overlay for the Windows desktop that makes the computer robust against several important categories of cyberattack. Mr. Stiegler is currently a Visiting Scholar at HP.
Michael Welge is the founder and Chief Scientist of RiverGlass, Inc., an Illinois company that develops scalable, flexible software for real-time discovery of knowledge in large, complex, disparate data sources. Mr. Welge also directs the Automated Learning Group at the University of Illinois' National Center for Supercomputing Applications (NCSA), co-directs the National Laboratory for Advanced Data Research, is an adjunct faculty member at the UI Institute for Genomic Biology, and is a faculty member at the university's Center for Information Technology.
Mr. Welge has worked in data mining, mathematical modeling, applied artificial intelligence, and stochastic simulation since 1982 and has applied these technologies in a wide range of fields, including manufacturing, environmental engineering, medicine, education policy, fraud detection, customer relationship management, business intelligence, telecommunications, risk management and crisis management. He is a frequent industry lecturer and has presented briefings and courses on data mining to the National Science Foundation, the Department of Defense, and the state of Illinois.
Mr. Welge holds a B.S. in mathematics and computer science from Southern Illinois University and an M.S. in mathematics from Washington University in St. Louis, completed under the direction of David Steinberg and Arthur Gardner.
Imagine an enterprise with seven thousand SMP servers (and counting) and thousands of applications serving a multi-billion-dollar business. Given an insatiable demand for compute cycles plus intense time-to-market pressure, how can we meet these business needs while maintaining both operational and fiscal efficiency?
This talk will describe how grid computing and utility computing are taking shape within Morgan Stanley. It will provide the business context under which these technologies are being used today and discuss the technical challenges that lie ahead. Much progress has been made in developing a common grid framework for developing and deploying CPU-hungry applications; however, a big challenge remains in dealing with CPU and other resource "fragmentation". Virtualization is a key emerging technology that can be leveraged to use resources more efficiently.
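The fragmentation problem is easy to see in miniature; the numbers below are hypothetical, but the pattern is the one virtualization helps with: plenty of capacity in aggregate, none of it in one usable piece.

```python
# A small sketch of resource fragmentation on a compute farm.
free_cpus = {"hostA": 2, "hostB": 3, "hostC": 3}   # 8 CPUs free in total
job_needs = 4                                      # CPUs on a single host

fits = [h for h, n in free_cpus.items() if n >= job_needs]
print(sum(free_cpus.values()), "CPUs free; hosts that fit the job:", fits)
# 8 CPUs free; hosts that fit the job: []  -- capacity exists, but fragmented
```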
Ik Yoo is Global Head of Enterprise Application Infrastructure (EAI) at Morgan Stanley. The mission of EAI is to explore new technologies, to seek out new standards and best practices. EAI has played a central role in making Morgan Stanley an early adopter of pioneering technologies such as Linux, Service Oriented Architecture and Grid Computing.
Morgan Stanley is a global financial services firm and a market leader in securities, investment management and credit services. With more than 600 offices in 28 countries, Morgan Stanley connects people, ideas and capital to help clients achieve their financial aspirations.