Department of Computer Science | Institute of Theoretical Computer Science | CADMO
Prof. Emo Welzl and Prof. Bernd Gärtner
(Financed by the Swiss National Science Foundation). A unique sink orientation (USO) is an orientation of the n-dimensional hypercube graph, with the property that every face (subcube) has a unique sink. In particular, there is a unique global sink.
The algorithmic problem associated with a USO is that of finding the global sink. A vertex evaluation oracle can be asked for the orientations of all n edges incident to a given vertex. The complexity of a (randomized) sink-finding algorithm is the (expected) number of oracle queries it needs in the worst case to find the global sink of an n-dimensional USO. It is unknown whether the sink can be found with polynomially (in n) many oracle queries, but there are also no hardness results.
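To make the query model concrete, here is a minimal illustrative sketch (our own; the vertex encoding, the particular USO, and the naive strategy are illustrative choices, not the project's methods). It uses the simplest USO, in which every edge points toward the endpoint with the bit cleared, so the unique global sink is vertex 0:

```python
# Vertices of the n-cube are bitmasks 0..2^n-1; the edge in coordinate i at
# vertex v connects v and v ^ (1 << i).  In this illustrative USO every edge
# points toward the vertex with the bit cleared, so the global sink is 0.

def make_oracle(n):
    """Vertex-evaluation oracle: returns, for vertex v, the set of
    coordinates whose incident edge is OUTGOING at v."""
    calls = [0]                      # query counter (the complexity measure)
    def evaluate(v):
        calls[0] += 1
        return {i for i in range(n) if v >> i & 1}   # set bits -> outgoing
    return evaluate, calls

def find_sink(n, evaluate, start):
    """Naive sink-finding: follow any outgoing edge until none remain."""
    v = start
    while True:
        out = evaluate(v)
        if not out:                  # no outgoing edge: v is the sink
            return v
        v ^= 1 << min(out)           # walk along one outgoing edge

evaluate, calls = make_oracle(4)
sink = find_sink(4, evaluate, start=0b1011)
print(sink, calls[0])                # the sink is vertex 0
```

On this trivial USO, path-following needs one query per set bit of the start vertex plus a final confirming query; on general USOs the whole difficulty is that no such structure is available.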
USOs were introduced in the context of linear complementarity problems by Stickney and Watson, and later as combinatorial objects by Szabó and Welzl. Other optimization problems have also been shown to give rise to USOs, most notably linear programming (LP), but also quadratic programming and more general convex optimization problems such as finding the smallest enclosing ball of a set of points, or a set of balls.
Being able to find the sink in polynomial time would answer two major open questions in optimization theory: (i) is there a strongly polynomial-time algorithm for linear programming? This question is on Smale's 1998 list of mathematical problems for the next century; (ii) is there a polynomial-time algorithm for P-matrix linear complementarity problems? These problems are well-studied and known to fall into a number of (recent) complexity classes, but hardness results are not known, either.
The fact that the aforementioned open questions are long-standing indicates that the goal of polynomial-time sink-finding in USOs is not very realistic. Despite this, USOs have been studied by many researchers (including the applicant) over the past twenty years, since they provide a clean and simple combinatorial framework in which many other relevant questions can be asked (and actually answered). These questions often impact or relate to concrete questions in optimization theory. For example, the currently best deterministic algorithm for P-matrix linear complementarity problems is USO-based. Notwithstanding the research interest in sink-finding in USOs, most of the initial algorithmic results of Szabó and Welzl have still not been improved significantly.
Over the last three years, many new but somewhat "isolated" insights on USOs have been obtained, although with some surprising connections showing up here and there. Much of this work was done in joint research of the applicant with students, in the context of Bachelor's, Master's and PhD theses. The goal of this project is to unify and extend this research, systematically explore promising directions that have so far only been touched, and eventually make progress also on the algorithmic problem. As Szabó and Welzl (lightheartedly, in hindsight) put it in 2001, "there is ample space for improvement". During the project, we want to explore this space.
Contact: B. Gärtner
Duration: Oct 2021 – Feb 2025.
(Financed by the Swiss National Science Foundation).
Arrangements and drawings are two fundamental concepts that play a
central role in discrete and computational geometry: Given a set L of
planar curves, the arrangement of L is the subdivision of the plane
into faces, edges, and vertices, each defined as a maximal connected
set of points that lie on exactly 0, exactly 1, or at least 2 of the curves in L,
respectively. Given a graph G, a planar drawing of G assigns a point
in the plane to each vertex of G and represents the edges of G by
simple Jordan arcs that connect the corresponding
endpoints.
Arrangements and drawings pop up in almost every corner of discrete
and computational geometry. For example, they can be used to encode
order relationships between points, to compute convex hulls, or to
find ham-sandwich cuts. Due to the fundamental nature and the
simplicity of the concepts, a vast number of variants, special cases,
generalizations, and abstractions have been developed. The goal of
this project is to undertake a systematic study of the different
structures and their representations, and of their relevant
combinatorial and algorithmic properties.
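As a small concrete illustration (ours, not part of the project): for an arrangement of n lines in general position, the vertices are the pairwise intersection points, and Euler's relation V - E + F = 1 then yields n^2 edges and 1 + n + n(n-1)/2 faces.

```python
from fractions import Fraction
from itertools import combinations

def intersect(l1, l2):
    """Intersection point of lines a*x + b*y = c, or None if parallel."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None
    x = Fraction(c1 * b2 - c2 * b1, det)
    y = Fraction(a1 * c2 - a2 * c1, det)
    return (x, y)

def arrangement_counts(lines):
    """(V, E, F) for an arrangement of lines in general position
    (pairwise non-parallel, no three lines through one point)."""
    verts = {intersect(l, m) for l, m in combinations(lines, 2)} - {None}
    n = len(lines)
    V = len(verts)
    E = n * n                  # each line is cut into n pieces
    F = 1 + n + V              # from Euler's relation V - E + F = 1
    return V, E, F

# three lines in general position: x = 0, y = 0, x + y = 1
print(arrangement_counts([(1, 0, 0), (0, 1, 0), (1, 1, 1)]))
```

For the three lines above this gives 3 vertices, 9 edges, and 7 faces.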
The project will be carried out as an international collaboration with
partners from Berlin and Graz.
Contact: M. Hoffmann
Duration: Jan 2018 – Sep 2022.
(Hasler-Stiftung, D–INFK.) In our information society, basic knowledge of computer science is indispensable for almost every profession. Sound computer science skills are required not merely to consume information technology but to help shape it. At the same time, computer science as a proper subject is still in its infancy in general-education schools. There is therefore a great need for more computer science education in schools.
Since 2015, the Kinderlabor has built up a high-quality educational offering at ETH within the pilot project "Programmieren von klein auf" ("Programming from an early age"), providing children between 4 and 8 years of age with an age-appropriate introduction to computer science. The strategic goal is to help establish computer science as a general-education subject that is taught from kindergarten to the Matura and appeals equally to boys and girls. The feedback from the teachers in the pilot project shows encouragingly good results and a high potential that still remains to be tapped. The goals of the planned project are therefore the following:
This will ensure that girls and boys across Switzerland become acquainted with computer science early on and get the chance to discover their enthusiasm for the subject. Teachers experience computer science as an exciting discipline, gain easy access to fundamental computational thinking, and can pass it on to the children. In the long term, industry and universities throughout Switzerland benefit from better-qualified school graduates.
Contact: B. Gärtner.
Duration: Jun 2017 – May 2020
(Erwin Schrödinger Fellowship, financed by the Austrian Science Fund (FWF).) For problems like counting geometric graphs on point sets, we are usually not interested in the exact positions of the points, but in how they are placed relative to each other. The “order type” of a point set tells us in which order we encounter three points of the set when walking along the boundary of the triangle defined by these points in counterclockwise direction. In previous work, we defined a relation on order types by comparing the set of crossing-free geometric graphs each order type admits. We plan to obtain relations between order types that provide more insight into the structure of maximizing and minimizing point sets. In particular, we are interested in bounding the number of triangulations. We will attempt to prove a long-standing conjecture stating that the number of triangulations is minimized by a certain order type, using the insights obtained from different relations and local transformations. Apart from bounding the number of geometric graphs, we are interested in relations between different order types in their own right.
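A minimal sketch (our own illustration) of the orientation predicate underlying order types: the sign of a 2x2 determinant tells whether a triple of points is traversed counterclockwise, and the table of these signs over all triples is the order type of the labelled set.

```python
from itertools import combinations

def orientation(p, q, r):
    """+1 if p, q, r make a counterclockwise turn, -1 if clockwise,
    0 if collinear (sign of the 2x2 cross-product determinant)."""
    d = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (d > 0) - (d < 0)

def order_type(points):
    """The order type of a labelled point set as the tuple of triple
    orientations, triples taken in lexicographic label order."""
    return tuple(orientation(p, q, r)
                 for p, q, r in combinations(points, 3))

square = [(0, 0), (1, 0), (1, 1), (0, 1)]   # convex position
inner = [(0, 0), (3, 0), (0, 3), (1, 1)]    # one point inside the hull
print(order_type(square), order_type(inner))
```

The two four-point sets above have different order types, which is exactly what distinguishes them combinatorially.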
Contact: A. Pilz.
Duration: Mar 2016 – Mar 2019.
(ETH Zurich Postdoctoral Fellowship.) The main objective of this project is to discover new fundamental results in the area of computational geometry from a theoretical point of view. We study dynamic data structures and algorithms motivated by the problem of having constantly changing objects in geometric spaces. Preprocessing is crucial to obtain a good query performance, either by independently preprocessing each object, or by a general data structure that captures the proximity information of all the objects (such as Voronoi diagrams). In this setting, we aim to study the following general problem: Given a collection of objects in a geometric space that are changing over time, detect or predict interactions between them or minimize a function defined on them. This general problem depends on several parameters and the result of modifying them is a large collection of related problems that can be tackled with similar sets of tools. If we work with a collection of hyperplanes, we obtain problems like the celebrated dynamic convex hull or dynamic linear programming. If each object is a moving convex polyhedron, we can turn to intersection testing and collision detection. Other variants include fundamental problems such as dynamic Voronoi diagrams and dynamic structures for nearest-neighbors, half-emptiness or linear programming queries. The problems in this collection draw their motivation from geographical information systems, motion planning, robotics, computer graphics or simulation, but are on their own fundamental theoretical questions. While efficient algorithms and data structures exist for some of the problems mentioned above, the vast majority assume that objects are static. In reality however, information is constantly being modified and data structures need to be adapted to deal with this fact. Designing geometric data structures for these problems on constantly changing data is a challenging task and the ultimate goal of this project.
Contact: L. Barba.
Duration: Sep 2016 – Aug 2018.
(Financed by the Department of Computer Science, ETH Zürich, Haslerstiftung, Akademie der Wissenschaften der Schweiz.) Together with the Department of Computer Science of ETH Zurich, the Kinderlabor is setting up a high-quality and at the same time generally accessible continuing-education offering in computer science. The target group are teachers in kindergarten and the lower primary grades (grades 1 and 2), including those without prior computer science knowledge. In the courses, the participants get to know the so-called Bee-Bots: child-friendly floor robots in the shape of bees. They can be programmed with four direction keys to follow prescribed paths. Together with the Bee-Bots, suitable teaching materials are presented and used. After the course, teachers can borrow a "Bee-Bot box" for a period of 2–4 weeks. On request, a computer science student from ETH Zurich visits the class to support and coach the teacher in its use.
Contact: B. Gärtner.
Duration: Jan 2015 – Dec 2016.
(Financed by the Swiss National Science Foundation.) Understanding and removing redundancy in given data to improve computational efficiency or discover its fundamental structure is a universal problem in science and engineering, as the data or the mathematical model to be analyzed in any realistic situation often contains redundant information that is implied by the rest. This research project has two intertwined goals, one of them theoretical, the other one driven by a concrete application. On the theoretical level, we want to understand the structure of redundancy and the complexity of redundancy removal in explicitly or implicitly given linear and more abstract systems. Here we strongly build on our expertise in computational geometry as well as combinatorial optimization. On the application side, we want to compute redundancy and assess the role of redundancy in neuromuscular control, a field that is trying to understand how the nervous system selects and implements specific muscle commands that meet the constraints of specific tasks. This part of the project will be performed in close collaboration with Francisco Valero-Cuevas. He pioneered the modeling of the interplay of various muscles in terms of zonotopes whose internal structure determines the possible tasks that the muscles under consideration can achieve together.
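To illustrate the kind of question on the theoretical side (a naive sketch of our own, not the project's methods): a linear inequality is redundant if maximizing its left-hand side subject to the remaining inequalities never exceeds its right-hand side. In the plane, and assuming a bounded feasible region, this test can be carried out by brute-force vertex enumeration.

```python
from fractions import Fraction
from itertools import combinations

# Constraints are triples (a, b, c) meaning a*x + b*y <= c.

def vertices(cons):
    """All feasible intersection points of constraint pairs
    (sufficient for LPs over a bounded 2D region)."""
    vs = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue                       # parallel pair
        x = Fraction(c1 * b2 - c2 * b1, det)
        y = Fraction(a1 * c2 - a2 * c1, det)
        if all(a * x + b * y <= c for a, b, c in cons):
            vs.append((x, y))
    return vs

def is_redundant(cons, k):
    """Is constraint k implied by the others (bounded region assumed)?
    True iff its LHS stays <= its RHS on every vertex of the rest."""
    others = cons[:k] + cons[k + 1:]
    a, b, c = cons[k]
    return all(a * x + b * y <= c for x, y in vertices(others))

# the unit square plus the clearly redundant constraint x + y <= 3
cons = [(1, 0, 1), (-1, 0, 0), (0, 1, 1), (0, -1, 0), (1, 1, 3)]
print([k for k in range(len(cons)) if is_redundant(cons, k)])
```

Exact rational arithmetic (`Fraction`) avoids the rounding pitfalls that make redundancy detection delicate in floating point; the project's setting of implicitly given and higher-dimensional systems is, of course, far beyond this toy test.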
Contact: B. Gärtner,
K. Fukuda.
Duration: Oct 2013 – Sep 2016.
(Financed by the Swiss National Science Foundation - EuroGIGA/ComPoSe). The goal of this project is the understanding of crossing-free geometric graphs, that is, graphs embedded on a given planar point set with the edges drawn as straight line segments without crossings. Often we are restricted to certain types of graphs, most prominently triangulations, but also spanning cycles, spanning trees, or (perfect) matchings (and crossing-free partitions), among others. A primary goal is to enumerate, count, or sample graphs of a certain type for a given point set (these are algorithmic questions), or to give estimates for the maximum and minimum number of such graphs on any set of n points (these are problems in extremal combinatorial geometry).
Contact: E. Welzl,
M. Wettstein.
Duration: Apr 2012 – Mar 2015.
(Financed by the Swiss National Science Foundation). We use the term simultaneous (geometric) embedding to refer to any kind of problem in which two or more graphs are to be embedded simultaneously, usually subject to certain constraints. One reason for the strong interest in simultaneous embeddings lies in a wealth of applications in areas such as bio-informatics, network analysis, and software engineering. For instance, in order to compare two phylogenetic trees one would like to have a "nice" simultaneous drawing of two trees sharing the same set of leaves, so-called tanglegrams. More generally, simultaneous embeddings come into play whenever large or dynamically changing graphs are to be visualized. Given a large graph, only part of it can be shown at a time with a sufficient level of detail. When browsing from one such part to a neighboring one, it is desirable to have a certain overlap that is common to both parts and drawn the same way for both parts. In this way, the user can maintain her mental map and more easily keep track of her current location in the graph. In the context of dynamic graphs, a simultaneous drawing of several instances of the graph at different timesteps makes it possible to visually examine the differences in order to, for instance, draw conclusions regarding the capacity utilization of a network. Beyond these motivating applications, the study of simultaneous embeddings has revealed a rich collection of interesting and challenging combinatorial and algorithmic problems, many of which are still far from being solved. The goal of this project is, on one hand, to gain more insights into some of these problems and, on the other hand, to develop new models in order to address some shortcomings of the settings studied so far. The project will be carried out as an international collaboration within the collaborative research project "GraDr: Graph Drawings and Representations", which comprises altogether 12 partners from seven European countries.
Contact: M. Hoffmann, V. Kusters
(Financed by European Commission). High-dimensional geometric data are ubiquitous in science and engineering, and thus processing and analyzing them is a core task in these disciplines. The Computational Geometric Learning project (CG Learning) aims at extending the success story of geometric algorithms with guarantees, as achieved in the CGAL library and the related EU-funded research projects, to spaces of high dimension. This is not a straightforward task. For many problems, no efficient algorithms exist that compute the exact solution in high dimensions. This behavior is commonly called the curse of dimensionality. We plan to address the curse of dimensionality by focusing on inherent structure in the data, such as sparsity or low intrinsic dimension, and by resorting to fast approximation algorithms. The following two kinds of approximation guarantees are particularly desirable: first, the solution approximates an objective better if more time and memory resources are employed (algorithmic guarantee), and second, the approximation gets better when the data become more dense and/or more accurate (learning-theoretic guarantee). To lay the foundation of a new field, computational geometric learning, we will follow an approach integrating both theoretical and practical developments, the latter in the form of the construction of a high-quality software library and application software.
Contact: B. Gärtner, C. Müller
(Financed by the Swiss National Science Foundation).
The goal of this project is to study higher dimensional analogues of graph
planarity (embeddings of simplicial complexes into Euclidean spaces) both
from an algorithmic and from a combinatorial point of view.
Embeddings of simplicial complexes into Euclidean spaces are a classical
topic in geometric topology (closely related to the classification of
embeddings up to ambient isotopy, the most famous special case of which is
classical knot theory). The most basic case of the embeddability problem,
graph planarity, is also a fundamental topic in graph theory, discrete
geometry, and theoretical computer science and has been studied intensively
both from a structural and from an algorithmic point of view (including
close connections to the theory of graph minors and (linear time) algorithms
for planarity testing, to mention just two aspects most relevant to this
project). Equally important in discrete geometry is the study of the
quantitative failure of planarity, i.e., crossing numbers,
and relations to graph parameters such as edge expansion and bisection width.
In this project, we investigate higher-dimensional analogues of these notions
and problems. Typical questions include: For which dimensions k and d is
there an algorithm that decides if a given k-dimensional complex embeds
into d-dimensional Euclidean space? When is the problem computationally
intractable or even undecidable?
What is the maximum complexity (number of simplices, in terms of the
number of vertices, say) of
a simplicial complex that embeds into d-space? What are the connections
between the eigenvalues of the Laplacian of a simplicial complex and its
embeddability properties?
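In the classical graph case (k = 1, d = 2), the extremal question has a well-known answer: a simple planar graph on n >= 3 vertices has at most 3n - 6 edges, by Euler's formula. The sketch below (ours, for illustration) uses this bound as a quick necessary condition for planarity:

```python
from itertools import combinations

def may_be_planar(n, edges):
    """Necessary (but not sufficient) planarity test: a simple planar
    graph on n >= 3 vertices has at most 3n - 6 edges."""
    return n < 3 or len(edges) <= 3 * n - 6

K5 = list(combinations(range(5), 2))   # complete graph on 5 vertices
K4 = list(combinations(range(4), 2))   # complete graph on 4 vertices
print(may_be_planar(5, K5), may_be_planar(4, K4))   # K5 fails: 10 > 9
```

The higher-dimensional analogue of this edge bound, the maximum number of k-simplices of a complex embeddable into d-space, is exactly the kind of extremal question the project asks.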
Contact: U. Wagner
(Financed by Google Research Awards). The goal of this research is to study fast approximation algorithms for kernel methods on the one hand, and for approximate matrix factorizations and semi-definite programs on the other hand. Building on the coreset paradigm, we try to contribute to the understanding of sparsity and algorithmic performance on large scale problems of this form, and will also consider several practical applications from machine learning.
Contact: B. Gärtner
(Financed by the Swiss National Science Foundation). The goal of this project is to study the geometric, combinatorial, and algorithmic foundations of support vector machines (SVM). The focus is on techniques that are orthogonal to the techniques used and developed in the machine learning community. The project will be carried out as an (inter)national collaboration, with two partners from Switzerland (ETH Zürich), and one partner from Germany (Friedrich-Schiller-Universität Jena).
Contact: B. Gärtner
(Financed by the Swiss National Science Foundation.)
We propose to study both linear and convex quadratic programming in a more general setting:
by examining the linear complementarity problem with sufficient matrices. Besides providing
a unifying framework for the two problems, linear complementarity also has many direct
applications, e.g. in control theory, finance, algorithmic game theory.
The tools we will use in our research can generally be described as combinatorial abstract
models of optimisation problems. We concentrate on two of them: oriented matroids and unique-sink
orientations.
Several algorithms have been suggested by previous research in this area. For most of them there
are both positive and negative results: they are known to be polynomial on some subclasses of
the problems, but super-polynomial on other, larger classes. For many classes, however, such
analysis is missing (among them are the two we consider the most important, that is, LCP with
P-matrices and with sufficient matrices). At present randomised algorithms appear promising,
and hence we plan to concentrate on their analysis.
We plan to attack the problem from two sides. We aim to find new classes of problems for which
some algorithm runs in polynomial (or expected polynomial) time; and we will search for new
examples of abstract optimisation problems for which known algorithms are slow. This will in
turn reduce the gap between positive and negative results for these algorithms. We believe
that this approach will eventually lead to a strongly polynomial algorithm for the linear
complementarity problem with sufficient matrices.
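For concreteness, an LCP instance asks for z >= 0 with w = Mz + q >= 0 and z^T w = 0. The following brute-force sketch (our own illustration; exponential in the dimension, so nothing like the algorithms under study) exploits that complementarity forces z_i = 0 or w_i = 0 for each index:

```python
from fractions import Fraction
from itertools import combinations

def solve(A, b):
    """Solve the square system A x = b exactly, or None if singular."""
    n = len(A)
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(b[i])]
         for i in range(n)]
    for col in range(n):
        piv = next((r for r in range(col, n) if M[r][col] != 0), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        M[col] = [v / M[col][col] for v in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                M[r] = [a - M[r][col] * p for a, p in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

def lcp(M, q):
    """Find z >= 0 with w = M z + q >= 0 and z^T w = 0 by trying every
    complementary index set alpha (z_i = 0 outside alpha, w_i = 0 inside)."""
    n = len(q)
    for r in range(n + 1):
        for alpha in combinations(range(n), r):
            z = [Fraction(0)] * n
            if alpha:
                sub = solve([[M[i][j] for j in alpha] for i in alpha],
                            [-q[i] for i in alpha])
                if sub is None or any(v < 0 for v in sub):
                    continue
                for idx, i in enumerate(alpha):
                    z[i] = sub[idx]
            w = [sum(M[i][j] * z[j] for j in range(n)) + q[i]
                 for i in range(n)]
            if all(v >= 0 for v in w):
                return z
    return None

# a P-matrix instance: the unique solution is z = (1/3, 1/3)
print(lcp([[2, 1], [1, 2]], [-1, -1]))
```

For P-matrices every one of the 2^n complementary systems is non-singular and exactly one yields a feasible pair; finding it without exhaustive search is the open algorithmic problem.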
(Financed by the Swiss National Science Foundation).
The goal of this project is to study combinatorial and algorithmic questions
related to the embeddability of simplicial complexes into Euclidean space and
obstructions thereto.
For graphs, the study of such questions has a long history and is best
exemplified by classical results such as Euler's polyhedral formula,
Kuratowski's characterization of planar graphs in terms of forbidden minors, or
the fact that planarity is testable in linear time.
On the topological side, embeddings of polyhedra and obstructions to their
existence are also a classical and well studied topic in geometric topology.
However, in contrast to the closely related topic of classifying embeddings up
to isotopy, in particular knot theory, which has been studied extensively also
from an algorithmic viewpoint, the computational complexity of the embeddability
problem had, until recently, not been addressed.
The first part of the present project concerns the systematic study of this
complexity question. Together with J. Matoušek and M. Tancer, we have shown
that embeddability of a given simplicial complex is at least NP-hard to test for
a wide range of the parameters (dimension k of the complex and ambient dimension
d), essentially, if they fall outside the metastable range of the
Haefliger-Weber Theorem. Moreover, in some cases the problem is even
algorithmically undecidable. Numerous open problems remain, such as elucidating
the complexity situation within the metastable range.
The second part of the project is concerned with extremal problems for
simplicial complexes and with threshold phenomena for random simplicial
complexes as introduced by Linial and Meshulam, both in the context of
embeddability. A fundamental extremal question is, for instance: How many
triangles can a 2-dimensional simplicial complex contain (in terms of the
number of vertices or the number of edges) if it admits an embedding into
4-space? The corresponding question for random complexes is: If K is a
2-dimensional complex on n vertices where every possible triangle is chosen
independently with probability p, what is the threshold probability p=p(n)
for K being embeddable in 4-space?
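The random model is easy to state in code. The sketch below (ours) samples the Linial–Meshulam complex Y_2(n, p): the complete 1-skeleton on n vertices, with each of the C(n,3) triangles included independently with probability p.

```python
import random
from itertools import combinations

def linial_meshulam(n, p, seed=None):
    """Sample Y_2(n, p): the full 1-skeleton on n vertices, with each
    triangle kept independently with probability p.  Returns the list
    of triangles; the 1-skeleton is implicit (all edges are present)."""
    rng = random.Random(seed)
    return [t for t in combinations(range(n), 3) if rng.random() < p]

Y = linial_meshulam(6, 0.5, seed=1)
print(len(Y), "of 20 possible triangles kept")
```

Threshold questions then ask for which p = p(n) a monotone property (such as embeddability into 4-space) appears with high probability.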
Contact: U. Wagner
(Financed by the Swiss National Science Foundation).
The goal of this project is to investigate two basic and interrelated problems
in discrete and computational geometry.
The first part concerns triangulations and crossing-free geometric graphs in
the plane. These structures are omnipresent, for example optimal Euclidean
matchings, spanning trees and tours are crossing-free, triangulations provide
structure to scattered point sets, etc. Moreover, combinatorial structures
relate to this notion, e.g. well-formed parenthesis expressions or binary trees.
While the situation is well-understood for point sets in convex position
(then it amounts to Catalan structures whose investigation goes back to Euler
around 1750), the general setting is much less understood. This concerns
extremal problems, algorithmic counting and enumeration.
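For points in convex position the counts are indeed classical; for instance, the triangulations of a convex polygon on n vertices are counted by the Catalan number C_{n-2} (a small sketch of ours):

```python
from math import comb

def catalan(k):
    """k-th Catalan number, C_k = binom(2k, k) / (k + 1)."""
    return comb(2 * k, k) // (k + 1)

def convex_triangulations(n):
    """Number of triangulations of a convex polygon with n vertices."""
    return catalan(n - 2)

print([convex_triangulations(n) for n in range(3, 9)])
# a triangle has a single triangulation, a convex hexagon has 14
```

For point sets in general position no such closed formula is known, which is what makes the extremal and counting questions of this project hard.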
The second part of the project concerns the combinatorics of partitioning
point sets in the plane by a line into two parts of prescribed sizes.
The question of how many such partitions there can be for a point set of a given
size is known as the "k-set problem". Apart from its intrinsic interest as a
combinatorial question, this problem arises in the analysis of a number of
geometric algorithms, and also turns out to be pivotal for numerous other
geometric questions, some of which seem at first quite unrelated.
One focus of this part will be on identities for k-sets and k-facets through
continuous motion, in particular in the 3-dimensional case. Moreover, we plan
to further investigate applications of k-sets to other geometric problems, e.g.
circle and sphere containment problems and rectilinear crossing numbers.
Contact: U. Wagner
(Financed by the Swiss National Science Foundation.)
Proper coloring of graphs and the chromatic number (the smallest
number of colors which allow a proper coloring) are among the most important concepts
of graph theory. Numerous problems of pure mathematics and theoretical computer science
require the study of proper colorings and even more real-life problems require the calculation
or at least an estimation of the chromatic number. Nevertheless, there is the discouraging
fact that computing the chromatic number of a graph and finding an
optimal proper coloring are both famously intractable problems; even fast
approximation is probably not possible. This is one of our motivations to study relaxations of proper coloring,
because in some theoretical or practical situations a small deviation from proper is still
acceptable, while the problem could become tractable. Another reason for the introduction of
relaxed colorings is that in certain problems the use of the full strength of proper coloring
is an ``overkill''. Often a weaker concept suffices and provides better overall results.
In the project we study a relaxation of proper coloring, which allows
the presence of some small level of conflicts in the color assignment.
Our approach relies heavily on the theory of transversals.
A set of vertices in a partitioned graph is called a transversal if
it contains exactly one vertex from each class. Transversals with
special properties surface in many different problems of combinatorics
and theoretical computer science, when an appropriate auxiliary graph is
constructed and a transversal of a certain kind is sought. In particular, independent
transversals have seen numerous applications in the study of the list chromatic number,
the satisfiability of boolean formulas, the linear arboricity of graphs and
relaxed colorings of graphs. The quest for transversal results also inspired a great
methodological advance in the field of combinatorics, as beautiful proofs
applying combinatorial topology appeared.
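As a small illustration of the basic objects (our own sketch, not the project's methods): greedy coloring properly colors any graph with at most Delta + 1 colors, and counting monochromatic edges measures how far a relaxed coloring deviates from a proper one.

```python
def greedy_coloring(n, edges):
    """Color vertices 0..n-1 greedily: each vertex gets the smallest
    color unused by its already-colored neighbors.  This uses at most
    (maximum degree + 1) colors."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    color = [None] * n
    for v in range(n):
        used = {color[u] for u in adj[v] if color[u] is not None}
        color[v] = next(c for c in range(n) if c not in used)
    return color

def conflicts(edges, color):
    """Number of monochromatic edges: 0 means the coloring is proper."""
    return sum(color[u] == color[v] for u, v in edges)

C5 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # odd cycle, chi = 3
col = greedy_coloring(5, C5)
print(col, conflicts(C5, col))
```

Relaxed colorings, as studied in the project, allow `conflicts` to be small but positive, or bound the conflicts locally, in exchange for fewer colors or tractability.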
Contact: T. Szabó
(Financed by the Swiss National Science Foundation.) The main topic of this research is the development of new algorithms for enumerating combinatorial objects, such as graphs and polytopes. These objects are challenging from the standpoint of computational complexity; for example, no polynomial-time algorithm is known for checking whether two graphs are isomorphic. At the same time, these enumeration problems have many applications in optimization, clustering, data mining, and machine learning, so efficient algorithms are needed, both in theory and in practice.
(Financed by the Swiss National Science Foundation.)
Manifold learning is the problem of computing a model of a k-dimensional manifold
embedded non-linearly in d-dimensional Euclidean space only from a finite set of
sample points. Typically k is very small compared to d. The importance of
the problem is underscored by its many applications, e.g. in speech recognition, weather
forecasting and economic prediction.
Special cases of the manifold learning problem in low dimensions received a lot of
attention in recent years. Most prominent is the surface reconstruction problem where
a two-dimensional manifold embedded in three-dimensional space has to be learned.
Some of the proposed algorithms for surface reconstruction come with the guarantee
that the computed model is a manifold homeomorphic and geometrically close to the
unknown manifold provided some sampling condition is fulfilled. Almost all known
provable algorithms for surface reconstruction use the Delaunay triangulation of the
sample points as basic data structure. Unfortunately the time to compute the Delaunay
triangulation is prohibitive in higher dimensions.
In this project we plan to investigate provable methods for manifold learning in
high dimensions. The starting point will be a sampling condition similar to the
one used in the proofs for surface reconstruction. The most important task in
extending theoretical guarantees known from surface reconstruction to higher
dimensions is to replace the Delaunay triangulation with other proximity data structures.
Contact: J. Giesen
(Financed by the Swiss National Science Foundation.)
Linear programming, as well as several other geometric optimization problems,
has polynomial-time algorithms in the bit model, i.e., when the input size
is measured by the bit complexity of the actual numbers in the data. It
is not known, however, whether there are strongly polynomial algorithms,
whose performance is quantified in terms of arithmetic operations on the
input numbers, independently of their bit complexity.
The goal of the project is to find, study, and algorithmically
exploit combinatorial models for certain discrete geometric optimization
problems. Our main motivation is to enhance our understanding of
the combinatorics behind linear programming and related - mostly
geometric - optimization problems, improve on the performance
guarantees of existing algorithms, and possibly discover novel,
more efficient approaches. Classical models of this kind are matroids,
oriented matroids and submodular functions.
Within the last ten years, the concept of LP-type problems has
become a recognized combinatorial model for linear programming and many
other important geometric optimization problems. For many
problems in the class, the currently best (or even optimal) combinatorial
algorithm is an algorithm for general LP-type problems.
In the last couple of years new exciting
combinatorial models for geometric optimization problems
have emerged, such as unique-sink orientations of cubes and
strong LP-type problems. Just like LP-type problems ten years
ago, these new combinatorial abstractions had immediate consequences;
they provide the only available algorithms for certain linear complementarity
problems with nontrivial running time guarantees.
Within this new project, we plan to explore unique-sink orientations,
strong LP-type problems, their relations to each other and
to other known combinatorial models, as well as algorithms for problems fitting
into the respective models.
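The canonical LP-type example is the smallest enclosing ball. The following compact sketch (our own, unoptimized) implements Welzl's randomized recursion in the plane:

```python
import random
from math import dist

def circle(R):
    """Smallest circle with the points of R (|R| <= 3) on its boundary."""
    if not R:
        return ((0.0, 0.0), -1.0)          # empty circle: contains nothing
    if len(R) == 1:
        return (R[0], 0.0)
    if len(R) == 2:
        (ax, ay), (bx, by) = R
        c = ((ax + bx) / 2, (ay + by) / 2)
        return (c, dist(c, R[0]))
    (ax, ay), (bx, by), (cx, cy) = R       # circumcircle of three points
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ((ux, uy), dist((ux, uy), R[0]))

def welzl(P, R=()):
    """Smallest enclosing circle of P with boundary points R; on a random
    insertion order the recursion runs in expected linear time."""
    if not P or len(R) == 3:
        return circle(list(R))
    p, rest = P[0], P[1:]
    c, r = welzl(rest, R)
    if dist(c, p) <= r + 1e-9:             # p already enclosed
        return (c, r)
    return welzl(rest, R + (p,))           # p must lie on the boundary

pts = [(0, 0), (2, 0), (0, 2), (1, 0.5)]
random.shuffle(pts)
(cx, cy), r = welzl(pts)
print((round(cx, 6), round(cy, 6)), round(r, 6))
```

On the four points above the answer is the circle with center (1, 1) and radius sqrt(2). The LP-type view abstracts exactly the two facts the recursion uses: a basis has at most 3 points (combinatorial dimension), and a violating point must enter the basis.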
Contact: B. Gärtner, T. Szabó
(European Project.)
The project intends to revisit the field of computational geometry
in order to understand how structures that are well-known for linear
objects behave when defined on curves and surfaces. We will consider
the main geometric structures like Voronoi diagrams, arrangements and
Minkowski sums, study their mathematical properties, and design
algorithms to construct such structures. To ensure the effectiveness
of our algorithms, we will precisely specify the basic numerical
primitives that need to be performed, and consider tradeoffs between
the complexity of the algorithms (i.e. the number of primitive calls)
and the complexity of the primitives and their numerical stability.
Carefully working out the algebraic issues and the robustness issues are two
central objectives of the project. Another major objective is to
integrate the various techniques developed in the project (e.g.
algebra, arithmetics), to compare experimentally various approaches
(e.g. exact versus approximate), and to provide complete solutions
for some fundamental problems.
At the Zurich site we focus on surface reconstruction and several
aspects of optimization within the project.
(Coordinator: J. Giesen)
(Financed by ETH Zurich and the German Research Foundation, DFG.
Starting January 1, 2003, at ETH only running scholarships are
funded up to the end of the originally planned time span, but no other activities.)
Successful scientific interaction between discrete mathematics,
algorithms, and application areas in computer science is
increasing. In particular, combinatorics and discrete geometry
provide the foundations for algorithms in general, and for
combinatorial optimization and computational geometry in
particular.
These fields, in turn, either have direct applications or provide
algorithmic concepts for many application areas such as
optimization, computer graphics and vision, geographic information
systems, robotics, molecular modeling and computer aided design.
This initiative focuses on educating graduate students in the
area where these fields interact. The Pre-Doc Program
constitutes a one-semester exposure to an active research
environment with dedicated research-oriented courses.
The Graduate Program is a joint initiative of the Departments of
Computer Science, Mathematics, and Electrical Engineering at ETH Zurich
together with the three
universities in Berlin (and a number of other partner institutions).
The goal is to establish international contacts in both research
and education, and to create a focused program with completion
of a Ph.D. thesis within two and a half years.
(Contact: H. Alt, Berlin, and E. Welzl, ETHZ.)
Financed by the European Community and by the Federal Office
for Education and Science within the ESPRIT IV Long Term Research
Program 28155 (at ETH together with the group of Prof. P. Widmayer).
Funding period: November 15, 1998 - May 14, 2000.
Project leader: E. Welzl; staff member: M. Hoffmann.
The goal of the project is the continued development of a geometry library
(of algorithms), the implementation of geometric algorithms, and
general studies of such implementations. The Swiss part of the
project deals mainly with algorithms for geometric
optimization (B-3) and with geometric search structures (B-4).
Financed by the Swiss National Science Foundation.
Funding period: 2 years, starting October 1, 1997.
Project leader: B. Gärtner; staff member: N.N.
The project is concerned with provably efficient
algorithms for the linear programming problem and
other, related optimization problems. A number of such problems arise
in computational geometry (e.g., the smallest ball enclosing a given
point set, or the distance between two polytopes). These problems
can be formulated and solved in an abstract framework, in which
the resulting algorithms work similarly to the simplex
algorithm for linear programming.
The theoretical questions that arise concern in particular
the complexity of simplex-like methods. It is open whether polynomial
bounds exist; the best known results give
subexponential running times for a randomized variant. Other
natural randomized variants have so far resisted analysis.
The goal of the project is, on the one hand, to improve the known
bounds, and, on the other hand, to find new upper and lower bounds
for the running time of various randomized methods.
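As an illustration of the kind of randomized, simplex-like method studied here, the following is a hedged sketch of Welzl's expected-linear-time algorithm for the smallest enclosing circle of a planar point set (the smallest-ball problem in dimension 2). All helper names are ours, and the sketch ignores degenerate inputs such as collinear boundary triples.

```python
import random

def circle_from(boundary):
    """Smallest circle determined by 0, 1, 2, or 3 boundary points."""
    if not boundary:
        return ((0.0, 0.0), 0.0)
    if len(boundary) == 1:
        return (boundary[0], 0.0)
    if len(boundary) == 2:
        (ax, ay), (bx, by) = boundary
        c = ((ax + bx) / 2, (ay + by) / 2)
        return (c, ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 / 2)
    # Circumcircle of three (non-collinear) points.
    (ax, ay), (bx, by), (cx, cy) = boundary
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ((ux, uy), ((ux - ax) ** 2 + (uy - ay) ** 2) ** 0.5)

def inside(circle, p, eps=1e-9):
    (cx, cy), r = circle
    return (p[0] - cx) ** 2 + (p[1] - cy) ** 2 <= (r + eps) ** 2

def welzl(points, boundary=()):
    # Process points one by one: a point outside the circle of the
    # remaining points must lie on the boundary of the final circle.
    if not points or len(boundary) == 3:
        return circle_from(list(boundary))
    p, rest = points[0], points[1:]
    c = welzl(rest, boundary)
    if inside(c, p):
        return c
    return welzl(rest, boundary + (p,))

def smallest_enclosing_circle(points):
    pts = list(points)
    random.shuffle(pts)  # random order yields expected linear time
    return welzl(pts)
```

For instance, for the four corners of a 2-by-2 square the sketch returns the circle with center (1, 1) and radius sqrt(2). The subexponential bounds mentioned above concern a different but related family of randomized pivot rules for such simplex-like schemes.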
Financed by the Swiss National Science Foundation.
Funding period: April 1, 1997 - March 31, 1999.
Project leader: J. Blömer; staff member: B. Trachsler.
The project considers the problem of sending hierarchically encoded
messages over networks, under the assumption that the networks
may lose arbitrary parts of the message.
This is the case, for example, on the Internet.
Priority Encoding Transmission (PET) allows the
user to assign different priorities to the parts of a hierarchically
encoded message. For a message encoded with PET,
the parts of the message can always be decoded in the order of
their priority. A first implementation of PET was integrated into
a video conferencing tool that uses the
MPEG video compression standard, with the different priorities
assigned according to the different MPEG frame types.
However, since MPEG uses the discrete cosine transform,
there are also intra-frame dependencies in MPEG. The
project will investigate how these dependencies can be
exploited in the assignment of priorities.
The most important tools in PET are so-called erasure-tolerant
codes. The second goal of the project is to design efficient
erasure-tolerant codes. In the best case, these are
codes that can be encoded and decoded in linear time.
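The erasure-tolerant codes mentioned above can be illustrated by a classical Reed-Solomon-style construction over a prime field (a quadratic-time textbook scheme, not the linear-time codes the project aims for; the sketch and its parameter choices are our own, not the project's code). The k message symbols become the coefficients of a polynomial; evaluating it at n points yields a codeword from which any k surviving symbols recover the whole message.

```python
# Toy erasure-tolerant code over the prime field GF(257): the k message
# symbols are polynomial coefficients, the codeword consists of the
# polynomial's values at x = 0..n-1, and any k received symbols recover
# the full message (up to n - k erasures are tolerated).
P = 257  # prime modulus; symbols are integers in 0..P-1

def encode(message, n):
    """Evaluate the message polynomial at x = 0..n-1."""
    k = len(message)
    assert k <= n <= P
    return [sum(m * pow(x, i, P) for i, m in enumerate(message)) % P
            for x in range(n)]

def decode(received, k):
    """Recover the k coefficients from >= k surviving (x, y) pairs
    by Lagrange interpolation over GF(P)."""
    pts = received[:k]
    coeffs = [0] * k
    for j, (xj, yj) in enumerate(pts):
        # Basis polynomial l_j(x) = prod_{m != j} (x - x_m) / (x_j - x_m).
        num = [1]      # numerator coefficients, lowest degree first
        denom = 1
        for m, (xm, _) in enumerate(pts):
            if m == j:
                continue
            new = [0] * (len(num) + 1)
            for i, a in enumerate(num):   # multiply num by (x - xm)
                new[i] = (new[i] - xm * a) % P
                new[i + 1] = (new[i + 1] + a) % P
            num = new
            denom = denom * (xj - xm) % P
        scale = yj * pow(denom, P - 2, P) % P  # Fermat inverse of denom
        for i in range(k):
            coeffs[i] = (coeffs[i] + scale * num[i]) % P
    return coeffs
```

For example, encoding a 3-symbol message into 6 codeword symbols survives the loss of any 3 of them; PET layers such codes with different redundancy per priority level.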
Financed by the European Community and by the Federal Office
for Education and Science within the ESPRIT IV Long Term Research
Program 21957 (at ETH together with the groups
of Prof. J. Nievergelt and Prof. P. Widmayer).
Funding period: October 1, 1996 - March 31, 1998.
Project leader: E. Welzl; staff member: M. Hoffmann.
The goal of the project is the development of a geometry library (of
algorithms), the implementation of geometric algorithms, and
general studies of such implementations. The Swiss part of the
project deals mainly with algorithms for Geographic
Information Systems (GIS), geometric search structures, as well as
visualization and surface reconstruction.