Tuesday, 30.12.2008, 14:30
New Algorithms for Ranking and Dimension Reduction
The study of ranking crosses many disciplines.
Social choice theoreticians have been interested for centuries
in finding a good way to rank a set of candidates. Econometricians
have been asking for decades how people choose from (and more
generally rank) alternative sets. More recently, information retrieval
theoreticians and practitioners have been interested in ranking search
query results. In verification, practitione...
[ Full version ]
Michael Bronstein (Novafora, Inc.)
Tuesday, 30.12.2008, 11:30
Partial matching is probably one of the most challenging problems in shape analysis.
The problem consists of matching similar parts of shapes that are dissimilar on the whole.
Conceptually, two shapes can be considered partially matching if they have significant similar parts,
with the simplest definition of significance being the size of the parts. Thus, partial matching can be
defined as a multicriterion optimization problem trying to simultaneously maximize the similarit...
[ Full version ]
Avinatan Hassidim (M.I.T.)
Sunday, 28.12.2008, 12:00
An important goal in algorithms and communication protocols is getting
the deterministic time complexity, respectively communication complexity,
closer to the best known randomized complexity. In this work, we investigate
the case where instead of one input, the algorithm/protocol is given multiple
\emph{independent} samples from an \emph{arbitrary unknown} distribution.
We show that in this case a strong derandomization result can be obtained by a
simple argument. O...
[ Full version ]
Orly Alter (Biomedical Engineering, University of Texas at Austin)
Thursday, 25.12.2008, 13:30
DNA microarrays make it possible to record the complete molecular biological signals that guide the progression of cellular processes on genomic scales. I will describe the ability of mathematical models, created from these data using matrix and tensor computations, to predict previously unknown biological as well as physical principles, which govern the activities of DNA and RNA.
First, I will describe the use of singular value decomposition to uncover "asymmetric He...
[ Full version ]
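For illustration only (not the speaker's code, and with a hypothetical genes-by-arrays matrix), a minimal numpy sketch of the kind of matrix computation referred to above: a singular value decomposition of an expression data matrix and the fraction of overall signal captured by each component.

    import numpy as np

    # Hypothetical genes-by-arrays expression matrix (rows: genes, columns: arrays).
    rng = np.random.default_rng(0)
    data = rng.standard_normal((1000, 12))

    # SVD: data = U * diag(s) * Vt.  Columns of U are patterns across genes,
    # rows of Vt are patterns across arrays, s gives their relative weight.
    U, s, Vt = np.linalg.svd(data, full_matrices=False)

    # Fraction of the overall expression captured by each singular component.
    fractions = s**2 / np.sum(s**2)
    print(fractions[:3])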
Wednesday, 24.12.2008, 13:30
Abstract: The Ramsey number $r_k(s,n)$ is the minimum $N$ such that every
red-blue coloring of the $k$-tuples of an $N$-element set contains either
a red set of size $s$ or a blue set of size $n$, where a set is called red
(blue) if all $k$-tuples from this set are red (blue). The problem of
estimating Ramsey numbers is one of the central problems in modern
Combinatorics and it has received a considerable amount of attention in
the last 80 years.
In this talk we p...
[ Full version ]
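For concreteness (a standard special case, not part of the talk's new results): for $k=2$, $r_2(s,n)$ is the least $N$ such that every red-blue coloring of the edges of the complete graph $K_N$ contains a red $K_s$ or a blue $K_n$. For example, $r_2(3,3)=6$: coloring the edges of $K_5$ by a red 5-cycle and its blue complement avoids monochromatic triangles, while every 2-coloring of the edges of $K_6$ must contain one.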
Shai Avidan (Adobe Systems Inc.)
Wednesday, 24.12.2008, 11:30
The growth in the number of digital photographs calls for new and improved image editing tools to help users view, navigate and manipulate them. Many of these tasks can be reduced to instances of organizing visual data to satisfy user constraints. In this talk I will present three such tools that help organize visual data at the pixel, patch and image level: Seam Carving, The Patch Transform, and Infinite Images.
Seam Carving attempts to retarget an image to fit a specific displa...
[ Full version ]
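A minimal sketch of the seam-carving idea (dynamic programming over a per-pixel energy map). This is an illustrative reimplementation, not the speaker's code; the gradient-magnitude energy below is just one common choice.

    import numpy as np

    def remove_vertical_seam(gray):
        """Remove one minimum-energy vertical seam from a 2D grayscale image."""
        h, w = gray.shape
        # Simple energy: sum of absolute horizontal and vertical gradients.
        energy = np.abs(np.gradient(gray, axis=0)) + np.abs(np.gradient(gray, axis=1))
        # Cumulative minimum energy M[i, j] of any seam ending at pixel (i, j).
        M = energy.copy()
        for i in range(1, h):
            left  = np.r_[np.inf, M[i-1, :-1]]
            up    = M[i-1, :]
            right = np.r_[M[i-1, 1:], np.inf]
            M[i] += np.minimum(np.minimum(left, up), right)
        # Backtrack the seam from the bottom row upwards.
        seam = np.empty(h, dtype=int)
        seam[-1] = int(np.argmin(M[-1]))
        for i in range(h - 2, -1, -1):
            j = seam[i + 1]
            lo, hi = max(0, j - 1), min(w, j + 2)
            seam[i] = lo + int(np.argmin(M[i, lo:hi]))
        # Drop the seam pixel from every row, shrinking the width by one.
        return np.array([np.delete(gray[i], seam[i]) for i in range(h)])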
Leah Bar (University of Minnesota, USA)
Tuesday, 23.12.2008, 11:30
Many problems in image processing are addressed via the minimization of a cost functional. The most prominent optimization technique is the gradient descent, often used due to its simplicity and applicability where other techniques, e.g., those coming from discrete optimization, can not be used. Yet, gradient descent suffers from a slow convergence, and often to just local minima which highly depends on the condition number of the functional Hessian. Newton-type methods, on the ot...
[ Full version ]
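To make the contrast concrete, here is a toy sketch (mine, not the speaker's) comparing a gradient-descent step with a Newton step on a badly conditioned quadratic, where the Hessian's condition number is exactly what slows gradient descent down.

    import numpy as np

    # Badly conditioned quadratic f(x) = 0.5 * x^T A x, condition number 1000.
    A = np.diag([1.0, 1000.0])
    x = np.array([1.0, 1.0])

    grad = A @ x              # gradient of f at x
    hess = A                  # Hessian of f (constant for a quadratic)

    x_gd = x - 1e-3 * grad                      # gradient descent: tiny step forced by conditioning
    x_newton = x - np.linalg.solve(hess, grad)  # Newton: reaches the minimizer in one step

    print(x_gd, x_newton)     # x_newton is (0, 0), the exact minimum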
Monday, 22.12.2008, 18:30
In this meeting we will continue from "Policy syntax" (after a quick
review of the previous one).
We will also see how to write an SELinux module to keep a certain
application under control.
SELinux is an access permission system that runs in many recent
distributions (with or without the computer owner's awareness).
Understanding how this system works is becoming a necessity for
smooth administration of an up-to-date Linux machine.
...
[ Full version ]
Uri Itai (Mathematics, Technion)
Sunday, 21.12.2008, 12:40
Subdivision schemes became popular in recent years in applications such
as computer graphics and CAGD (computer aided geometric design). Their
implementation is rather simple, yet there is a rich mathematical theory
behind them. We will introduce two basic subdivision schemes (corner
cutting and the 4-point scheme) and present implementations of them in computer
graphics and in animation....
[ Full version ]
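A minimal sketch of the two schemes mentioned above (assumed standard forms: Chaikin-style corner cutting and the interpolatory 4-point scheme on a closed polygon; not the speaker's code).

    import numpy as np

    def corner_cut(points):
        """One round of Chaikin corner cutting on a closed polygon (n x 2 array)."""
        p, q = points, np.roll(points, -1, axis=0)
        return np.reshape(np.stack([0.75 * p + 0.25 * q,
                                    0.25 * p + 0.75 * q], axis=1), (-1, 2))

    def four_point(points):
        """One round of the interpolatory 4-point scheme on a closed polygon."""
        pm1, p0, p1, p2 = (np.roll(points, k, axis=0) for k in (1, 0, -1, -2))
        new = (9.0 / 16.0) * (p0 + p1) - (1.0 / 16.0) * (pm1 + p2)
        return np.reshape(np.stack([points, new], axis=1), (-1, 2))

    square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    print(corner_cut(square).shape, four_point(square).shape)  # (8, 2) (8, 2)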
Thursday, 18.12.2008, 14:30
Many interesting questions in coding theory revolve around conflicts
(tradeoffs). Among them are a conflict between structure and
randomness, and a conflict between density and symmetry. In some of
the cases the conflicts are helpful in understanding solutions for
questions (e.g. using the structure and randomness conflict we
obtain improved analysis of Reed Muller codes), while in other cases
these conflicts shed light on the inherent difficulty of the
questions.
In the...
[ Full version ]
Doron Betel (Memorial Sloan-Kettering Cancer Center, N.Y.)
Wednesday, 17.12.2008, 13:30
Small regulatory RNAs are a class of non-coding RNAs that are primarily involved in post-transcriptional
gene silencing. The principle members of this growing class are microRNAs and small interfering
RNAs (siRNAs), which guide a silencing complex to an mRNA target by base pairing at the 3' untranslated
region (3'UTR).
In recent years, microRNAs have emerged as a major class of regulatory genes central to
a wide range of cellular activities, and their importance i...
[ Full version ]
Shachar Lovett (Weizmann Institute of Science)
Sunday, 14.12.2008, 12:00
We consider private data analysis in the setting in which a trusted and trustworthy curator,
having obtained a large data set containing private information, releases to the public a
"sanitization'' of the data set that simultaneously protects the privacy of the individual
contributors of data and offers utility to the data analyst. We focus on the case where the
process is non-interactive; once the sanitization has been released the original data and the
curator play no...
[ Full version ]
Tal Hassner (CS, The Open University of Israel)
Tuesday, 09.12.2008, 11:30
Recent methods for learning the similarities between images have
presented impressive results on the problem of pair-matching
(same/not-same classification) of face images. In this talk we present
pair-matching results comparing the performance of image descriptor
based methods to the state of the art in same/not-same classification,
obtained on the Labeled Faces in the Wild (LFW) image set. We propose
various contributions, spanning several aspects of automatic ...
[ Full version ]
Ronen Shaltiel (Haifa University)
Sunday, 07.12.2008, 12:00
A function $f$ is $\delta$-hard for some class of circuits if every
circuit in the class errs on a $\delta$-fraction of the inputs. For
various classes of constant depth small size circuits there are explicit
functions that are $\delta$-hard for ``small'' values of $\delta$.
For stronger circuit classes there are many ``hardness amplification
theorems'' that show how to transform any $\delta$-hard function $f$
into a ``harder'' function $g$ that is $(1/2-\epsilon)$-har...
[ Full version ]
Prahladh Harsha (CS, University of Texas at Austin)
Wednesday, 03.12.2008, 12:00
In this reading group, we will study the parallel repetition theorem,
its connection to unique games and tiling in high dimensional spaces.
The parallel repetition theorem [Raz 1998] has been extremely useful
to compute the exact inapproximability of several optimization
problems. Two years ago, Holenstein gave a simplified proof of Raz's
theorem. Since then, there has been a flurry of activity in this area:
understanding the behavior for special type of 2-prover g...
[ Full version ]
Enav Weinreb (Computer Science, Technion)
Sunday, 30.11.2008, 12:00
Consider the ``plain'' multiparty communication complexity model,
where k players $P_1,\dots,P_k$ holding inputs
$x_1,\dots,x_k\in\{0,1\}^n$ (respectively) communicate in order to
compute the value $f(x_1,\dots,x_k)$. The main lower bound
technique for such communication problems is that of {\em
partition arguments}: partition the $k$-players into two disjoint
sets of players and find a lower bound for the induced two-party
communication complexity problem. In this paper, we s...
[ Full version ]
Eli Berger (Haifa University)
Wednesday, 26.11.2008, 13:30
The idea of using topology in order to solve combinatorial problems has
been known for several decades, but only recently it started to become an
organized theory. In my talk I will introduce several classical
topological theorems such as Brouwer's fixed point theorem and Sperner's
Lemma and give the basic methods for using them in combinatorial settings.
I will also describe a new approach that may enable us to use the Borsuk
Ulam theorem in similar settings and hopefully o...
[ Full version ]
Dieter van Melkebeek (Univ. of Wisconsin)
Sunday, 23.11.2008, 12:00
Ever since the work of Cook and Levin, satisfiability has been
recognized as a central problem in computational complexity. It is
widely believed to be intractable, and yet till recently even a
linear-time logarithmic-space algorithm was not ruled out.
The last few years have seen considerable progress in time-space lower
bounds for satisfiability and closely related problems on various
models of computation. This talk will describe the state-of-the-art on
...
[ Full version ]
Emmanuel Tannenbaum (Ben Gurion University)
Tuesday, 04.11.2008, 11:30
This talk provides an introduction to quasispecies theory, a framework for modeling evolutionary dynamics that was originally introduced to study problems related to the origin of life. Since the original formulation of the theory by Manfred Eigen in 1971, quasispecies theory has been used to study evolutionary processes in a variety of other contexts, but with a focus on molecular and viral evolution. In recent years, we have worked to steadily develop quasispecies theory into a ...
[ Full version ]
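For reference, the standard Eigen quasispecies equations (in their usual form; the speaker's notation may differ): $\dot{x}_i = \sum_j Q_{ij} f_j x_j - \bar{f}(t)\,x_i$, where $x_i$ is the frequency of genotype $i$, $f_j$ its replication rate, $Q_{ij}$ the probability that replication of genotype $j$ yields genotype $i$, and $\bar{f}(t)=\sum_j f_j x_j(t)$ the mean fitness that keeps the frequencies normalized; the quasispecies is the dominant eigenvector of the matrix $(Q_{ij} f_j)$.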
Monday, 06.10.2008, 14:00
We initiate the study of scenarios that combine both online
decision making and interaction between non-cooperative agents. To
this end we introduce the notion of online games that model such
scenarios as non-cooperative games, and lay out the foundations
for studying this model. Roughly speaking, an online game captures
systems in which independent agents serve requests in a common
environment. The requests arrive in an online fashion and each is
designated to be served by a diff...
[ Full version ]
Thursday, 18.09.2008, 11:30
Sensors, signal processing hardware, and algorithms are under
increasing pressure to accommodate ever larger data sets; ever faster
sampling and processing rates; ever lower power consumption; and
radically new sensing modalities. Fortunately, there have been
enormous increases in computational power. This progress has motivated
Compressed Sensing (CS), an emerging field based on the revelation
that a sparse signal can be reconstructed from a small number of
linear measurem...
[ Full version ]
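A minimal numpy sketch of the sparse-recovery idea behind CS, using orthogonal matching pursuit, one of several standard decoders; this illustrates the principle only and is not the speaker's method. The matrix size and sparsity are arbitrary choices.

    import numpy as np

    def omp(A, y, k):
        """Greedy recovery of a k-sparse x from y = A x (orthogonal matching pursuit)."""
        residual, support = y.copy(), []
        for _ in range(k):
            # Pick the column most correlated with the current residual.
            support.append(int(np.argmax(np.abs(A.T @ residual))))
            # Re-fit y on the chosen columns and update the residual.
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        x = np.zeros(A.shape[1])
        x[support] = coef
        return x

    rng = np.random.default_rng(1)
    A = rng.standard_normal((60, 200)) / np.sqrt(60)   # random measurement matrix
    x_true = np.zeros(200); x_true[[5, 17, 90]] = [1.0, -2.0, 0.5]
    x_hat = omp(A, A @ x_true, k=3)
    print(np.allclose(x_hat, x_true, atol=1e-8))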
Joab Winkler (CS, University of Sheffield)
Tuesday, 19.08.2008, 09:30
The discussion of the structured condition number and Gauss' method for computing the roots of a
polynomial shows that a sequence of greatest common divisor (GCD) computations of two polynomials
must be performed. This is a classical problem in mathematics that is often solved by Euclid's algorithm.
This algorithm is satisfactory for symbolic computations with exact polynomials, but the computation
of the GCD of inexact polynomials is not trivial because it is ill-posed (not i...
[ Full version ]
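A sketch of the exact Euclid iteration the abstract refers to, using numpy's polynomial division; the naive tolerance cut-off below only hints at why the inexact case is delicate and is not the speaker's approach.

    import numpy as np

    def poly_gcd(p, q, tol=1e-10):
        """Euclid's algorithm on polynomial coefficient arrays (highest degree first)."""
        p = np.trim_zeros(np.asarray(p, float), 'f')
        q = np.trim_zeros(np.asarray(q, float), 'f')
        while q.size and np.max(np.abs(q)) > tol:
            _, r = np.polydiv(p, q)          # p = quotient*q + r
            p, q = q, np.trim_zeros(r, 'f')  # continue with (q, remainder)
        return p / p[0]                      # monic GCD (up to the tolerance choice)

    # (x-1)(x-2) and (x-1)(x-3) share the exact factor (x-1).
    print(poly_gcd(np.poly([1, 2]), np.poly([1, 3])))   # ~[1, -1], i.e. x - 1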
Thursday, 07.08.2008, 13:30
See the conference program and invitation below....
[ Full version ]
Yong-Jun Kim (Computer Science, Seoul National University)
Sunday, 03.08.2008, 12:40
In this talk, I would like to present a user interface for drawing
mathematical figures which can be quite complex. The system is based on a
sketch interface especially designed for drawing mathematical figures. I
would also like to present a search interface that finds figures similar to
the one drawn by the user. Given a sketch by the user, the system searches
a database of pre-drawn figures, automatically finds similar figures, and
shows them in a certain sorted order...
[ Full version ]
Asaf Nussbaum (Weizmann Institute of Science)
Sunday, 03.08.2008, 11:00
We study the limits of efficient computation in the context of
constructing random-looking graph distributions that can be used to
emulate huge random graphs over N=2^n vertices. We suggest and analyze a
new criterion of faithful emulation, inspired by the famous 0/1-law of
random graphs. This 0/1-law states that any graph property T holds with
probability converging either to 0 or to 1, whenever T is expressible by
a single formula on graphs.
We consider preservi...
[ Full version ]
Dominik Bartuschat (CS, Erlangen University, Germany)
Wednesday, 30.07.2008, 00:00
The K-SVD algorithm is a powerful tool for image denoising but
computationally quite intensive. In order to accelerate it we
implement the most time consuming parts like sparse coding and
assembling the final denoised image vectorized and in parallel on the
Cell processor. The Cell processor is a heterogeneous multicore system
consisting of a general purpose Power-PC processor unit (PPU) and
several fast co-processors, the synergistic processor elements (SPEs).
We present r...
[ Full version ]
Sasha Gusev (Columbia University)
Thursday, 24.07.2008, 13:30
Identifying and quantifying the sharing of genetic material between individuals within a population is
an important step in accurately using genealogical relationships for disease analysis as well as improving
our understanding of demography. However, exhaustive pairwise analysis which has been successful in
small cohorts cannot keep up with the current torrent of genotype data. We present GERMLINE, a robust
algorithm for identifying pairwise segmental shar...
[ Full version ]
Professor Noga Alon (Tel Aviv University, 2008 Israel Prize recipient)
Wednesday, 23.07.2008, 16:30
Abstract: The discovery that deterministic statements can be proved by
probabilistic reasoning, led already more than fifty years ago to several
striking results in various mathematical disciplines. It soon became clear
that the method, which is now called the probabilistic method, is a very
powerful tool for proving results in Discrete Mathematics.
After some light examples that illustrate the fact that probabilistic
arguments may be counter-intuitive, I will sketch...
[ Full version ]
Sunday, 20.07.2008, 14:30
The lecture topics are:
1) Real Time programming in a nutshell.
2) Programming languages comparison (Real Time related).
3) Java vs. C/C++ with relevant benchmarks.
4) Why Java, Why Java for Real Time?
5) What is the Real Time Java Standard?
6) Sun Real Time Java technology
7) Customers and case studies
8) Real Time Java competitive landscape
For more information please contact David: David.Roitman at Sun.com
...
[ Full version ]
Dana Moshkovitz, Weizmann
Sunday, 20.07.2008, 11:00
We show that the NP-Complete language 3Sat has a PCP
verifier that makes two queries to a proof of almost-linear size
and achieves sub-constant probability of error o(1). The
verifier performs only projection tests, meaning that the answer
to the first query determines at most one accepting answer to the
second query.
Previously, by the parallel repetition theorem, there were PCP
Theorems with two-query projection tests, but only (arbitrarily
small) constant erro...
[ Full version ]
Benjamin Rossman (M.I.T.)
Sunday, 13.07.2008, 11:00
I will discuss a recent lower bound of Omega(n^(k/4)) on the size of depth-d
circuits for the k-clique problem on n-node graphs. This improves previous
lower bounds (e.g. Omega(n^(k/89d^2)) [P. Beame '90]) by removing dependence
on d from the exponent of n. This result has an interesting corollary in
logic: it implies that the "bounded variable logics" FO^1, FO^2, ..., FO^k,
... (where FO^k consists of first-order sentences in which no subformula has
more than k free variab...
[ Full version ]
Prof. Simon Kasif (Boston University and Children's Hospital, Boston)
Thursday, 10.07.2008, 13:30
We describe several new methodologies for developing and evaluating novel diagnostics for diabetes and cancer. In the first part of the talk we describe the Gene Network Enrichment Analysis (GNEA) methodology for identifying molecular markers for diabetes. This approach suggests that network based techniques might be more sensitive and accurate at identifying disease progression than single genes. At the same time we should pay significant attention to both the biological inter...
[ Full version ]
Prof. Hod Lipson (Mechanical & Aerospace Engineering, Computing & Information Science,
Cornell University)
Thursday, 10.07.2008, 11:30
This talk will describe new active learning processes for automated modeling of dynamical systems across a number of disciplines. One of the long-standing challenges in robotics is achieving robust and adaptive performance under uncertainty. One approach to adaptive behavior is based on self-modeling, where a system continuously evolves multiple simulators of itself in order to make useful predictions. The robot is rewarded for actions that cause disagreement among predictions of ...
[ Full version ]
Stefanie Hahmann (INPG, Grenoble)
Monday, 07.07.2008, 10:00
The recent ability to measure quickly and inexpensively dense
sets of points on physical objects has deeply influenced the
way engineers represent shapes in CAD systems, animation
software or in the game industry. Many researchers advocated
to completely bypass smooth surface representations, and to
stick to a dense mesh model throughout the design process. Yet
smooth analytic representations are still required in standard
CAD systems and animation software, for reas...
[ Full version ]
Or Meir (Weizmann Institute of Science)
Sunday, 06.07.2008, 11:00
An error correcting code is said to be locally testable if there is a test that
can check whether a given string is a codeword of the code, or rather far from
the code, by reading only a constant number of symbols of the string. Locally
Testable Codes (LTCs) were first explicitly studied by Goldreich and Sudan
(J. ACM 53(4)) and since then several constructions of LTCs were suggested.
While the best known construction of LTCs achieves very efficient parameter...
[ Full version ]
Thursday, 03.07.2008, 14:30
The central problem in computational mechanism design is the tension
between incentive compatibility and computational efficiency. We address
this question in the context of a novel mechanism design problem which we
call the combinatorial public project problem. We show that, even though
this problem is easily solvable from a computational viewpoint, and from
an economic perspective, no algorithm which is simultaneously truthful
and runs in polynomial-time can obtain any...
[ Full version ]
Prof. Bernard Widrow (EE, Stanford University)
Wednesday, 02.07.2008, 11:30
Regarding the workings of the human mind, memory and pattern recognition seem to be intertwined.
You generally do not have one without the other. Taking inspiration from life experience, a new
form of computer memory has been devised. Certain conjectures about human memory are keys to the central idea.
The design of a practical and useful "cognitive" memory system is contemplated, a memory system that
may also serve as a model for many aspects of human memory. The new memory...
[ Full version ]
Wednesday, 25.06.2008, 16:00
With the recent advances in technology, we are faced with the need to process increasingly larger amounts of data
increasingly faster.
Originally, a problem was considered computable if there was an algorithm that could decide it in finite time given any
input instance. Later came the notion of polynomial time computation, and then the possibility of making computations
faster through use of parallel machines. However, the algorithms involved still face the obvious obstacle of...
[ Full version ]
Amnon Shashua (School of Engineering and Computer Science, Hebrew University of Jerusalem)
Tuesday, 24.06.2008, 14:00
Quite a number of problems involving inference from data, whether visual data or otherwise, fall into the category of optimization. I will describe a general scheme for sequential and parallel (message passing) update rules based on the framework of Fenchel duality. The presentation will be done in the context of two applications (i) distributed
(parallel) Support-Vector-Machines (SVM) for handling large scale problems (to be presented at CVPR'08), (ii) Graphical Model inference...
[ Full version ]
Sven Koenig (University of Southern California)
Tuesday, 24.06.2008, 11:30
Path planning for robots in unknown terrain has been studied in both theoretical robotics and theoretical computer science. However, empirical robotics researchers have often developed their own planning methods. These planning methods have been demonstrated on mobile robots that solve complex real-world tasks and perform well in practice. I will discuss some of these planning methods (namely, Planning with the Freespace Assumption, Greedy Mapping and Greedy Localization) and show...
[ Full version ]
Eitan Grinspun (Computer Science, Columbia University)
Sunday, 22.06.2008, 14:00
What does it take to build fast simulators for cloth and thin
shells—in particular to do so with code that is simple to implement,
debug, and maintain? Our group's belief is that simple geometric
observations help to address all these goals. I will survey my
students' recent work on the discrete differential geometry of thin,
flexible surfaces, and focus on a couple simple geometric observations
that lead to efficient and simple models for bending, stretching, and
artisti...
[ Full version ]
Avinatan Hassidim (Hebrew University of Jerusalem)
Sunday, 22.06.2008, 11:00
We introduce a variant of Quantum Multi Prover Interactive Proofs (QMIP),
where the provers do not share entanglement, the communication
between the verifier and the provers is quantum, but the provers
are unlimited in the classical communication between them.
At first, this model may seem very weak, as provers who exchange
information seem to be equivalent in power to a simple prover.
This in fact is not the case - we show that any language in NEXP
can be recognized in thi...
[ Full version ]
Alan Oppenheim (Massachusetts Institute of Technology)
Sunday, 15.06.2008, 16:00
Digital processing of analog signals naturally requires a
representation of continuous time signals as a discrete-time
sequence. The most common representation of this type is based on
the Shannon-Nyquist sampling theorem which provides an exact
representation for band limited signals sampled at a sufficiently
high rate but leads to aliasing error when the signal is under
sampled. In this talk a number of other approaches are discussed
to obtai...
[ Full version ]
Micha Sammeth (Genomic Regulation, Barcelona, Spain)
Sunday, 15.06.2008, 13:30
Eukaryotic splicing structures are known to involve a high degree of alternative forms derived from a premature transcript by alternative splicing (AS). With the advent of new sequencing technologies, evidence for new splice forms becomes more and more easily available, bit by bit revealing that the true splicing diversity of AS events often comprises more than two alternatives and therefore cannot be sufficiently described by pairwise comparisons as conducted in analyses hitherto....
[ Full version ]
Venkatesan Guruswami (University of Washington)
Sunday, 15.06.2008, 13:30
Classical results in high-dimensional geometry from the 1970's state that a random subspace of R^N of dimension N/2 has "distortion" O(1) w.h.p where the distortion is measured by sqrt{N} times the largest ratio of the L_2 to L_1 norms of a vector in the subspace. (So each vector in the subspace has its mass spread nearly evenly over a constant fraction of the coordinates.) This result is a particular example of the use of the probabilistic method, a technique which is ubiquitous ...
[ Full version ]
Eli Biham (Computer Science, Technion)
Wednesday, 11.06.2008, 16:30
Abstract: In this talk we describe the ciphers and protocols used for
the GSM cellular phone network, and discuss the (in)security of the
system. We describe several techniques to attack the ciphers A5/2 and
A5/1, and how they can be applied as a ciphertext-only attack. We also
show that active attacks on the protocols can recover keys of ciphers that
are not used during that transmission. As a result, it is possible to
listen in to GSM phone conversations, steal calls during...
[ Full version ]
Hilla Ben-Yaacov (EE, Technion)
Tuesday, 10.06.2008, 11:30
3D Objects Description and Classification by Implicit Polynomials
...
[ Full version ]
Zeev Dvir (Weizmann Institute)
Sunday, 08.06.2008, 11:30
A Kakeya set in F^n, where F is a finite field, is a set containing a line in every direction. The finite field Kakeya conjecture states that the size of such sets is bounded from below by C_n*|F|^n, where C_n depends only on the dimension n. I will talk about the recent proof of this conjecture and its connection to problems in theoretical computer science....
[ Full version ]
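Spelled out (a paraphrase of standard definitions, not new material from the talk): a set $K \subseteq F^n$ is a Kakeya set if for every direction $b \in F^n$ there is a point $a \in F^n$ with the whole line $\{a + tb : t \in F\}$ contained in $K$; Dvir's theorem gives $|K| \geq C_n |F|^n$ with $C_n$ depending only on $n$ (one may take $C_n$ roughly $1/n!$).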
John Smolin (IBM Watson at Yorktown Heights, NY)
Tuesday, 20.05.2008, 14:30
The black hole information loss paradox has plagued scientists since
Hawking's discovery that black holes evaporate thermally in contradiction to
the unitarity expected by quantum mechanics. I will explain just what the
problem is, and show that one of the central presumptions of the debate is
incorrect.
It had been thought that (obviously) either all the information must remain in
the black hole until it has completely evaporated OR it must leak out earlier.
Howeve...
[ Full version ]
CGGC Seminar: Gill Barequet (Computer Science, Technion)
Sunday, 18.05.2008, 14:00
A $d$-dimensional polycube is a connected set of $d$-dimensional cubes on an
orthogonal lattice, where connectivity is through $(d-1)$-dimensional faces.
A polycube is said to be proper in $d$ dimensions if it spans all the $d$
dimensions, that is, the convex hull of the centers of all its cubes is
$d$-dimensional.
We prove a few new formulae for the numbers of (proper and total) polycubes,
and show that (2d-3)e + O(1/d) is the asymptotic growth rate of the number
o...
[ Full version ]
Wednesday, 14.05.2008, 16:00
Model checking is an automated technique for checking
whether or not a given system model fulfills a desired property,
described as a temporal logic formula. Yet, as real models tend to
be very big, model checking encounters the state-explosion problem,
which refers to its high space requirements. Two of the most
successful approaches to fighting the state-explosion problem in
model checking are abstraction and compositional model checking.
Abstractions hide certain detai...
[ Full version ]
Kai Hormann (Clausthal Institute of Technology)
Tuesday, 13.05.2008, 11:00
Fluid simulations typically produce complex three-dimensional
iso-surfaces whose geometry and topology change over time.
The standard way of representing such dynamic geometry is by a
set of iso-surfaces that are extracted individually at certain
time steps. An alternative strategy is to represent the whole
sequence as a four-dimensional tetrahedral mesh. The iso-surface
at a specific time step can then be computed by intersecting the
tetrahedral mesh with a three-dimens...
[ Full version ]
CGGC Seminar: Gadi Aleksandrowicz (Computer Science, Technion)
Sunday, 11.05.2008, 13:00
$d$-dimensional polycubes are the generalization of planar polyominoes
to higher dimensions. That is, a $d$-D polycube of size $n$ is a connected
set of $n$ cells (hypercubes) of an orthogonal $d$-dimensional lattice,
where connectivity is through $(d-1)$-dimensional faces of the cells.
Computing $A_d(n)$, the number of distinct $d$-dimensional polycubes of
size $n$, is a long-standing elusive problem in discrete geometry.
In a previous work we described the generalization ...
[ Full version ]
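A brute-force sketch of what $A_d(n)$ counts, specialized to $d=2$ (fixed polyominoes): it grows cells one at a time and normalizes by translation, so it is only a definitional illustration, far from the serious counting methods used in this line of work.

    def fixed_polyominoes(n):
        """Count fixed (translation-distinct) 2D polyominoes with n cells."""
        def normalize(cells):
            mx = min(x for x, y in cells)
            my = min(y for x, y in cells)
            return frozenset((x - mx, y - my) for x, y in cells)

        shapes = {frozenset({(0, 0)})}          # the single cell
        for _ in range(n - 1):
            grown = set()
            for shape in shapes:
                for (x, y) in shape:
                    for nb in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
                        if nb not in shape:
                            grown.add(normalize(shape | {nb}))
            shapes = grown
        return len(shapes)

    print([fixed_polyominoes(n) for n in range(1, 6)])   # [1, 2, 6, 19, 63]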
Yoram Bresler (University of Illinois at Urbana-Champaign)
Tuesday, 06.05.2008, 11:30
We survey applications of classical and of time-sequential sampling
theory and some of its recent extensions, respectively, in two
complementary areas. First to reduce the computation for tomographic
reconstruction from $O(N^3)$
to $O(N^2\log N)$ for an $N \times N$ image, with similar acceleration
for 3D images. And second, to reduce acquisition requirements for
dynamic imaging below those predicted by classical theory.
In both areas, the savings demonstrated in practic...
[ Full version ]
Adi Shraibman (Math & CS, Weizmann Inst.)
Tuesday, 06.05.2008, 10:30
We will describe the results from a series of papers, starting from papers of Forster, Ben-David and others to more recent works of Sherstov
and Chattopadhyay. The topics of these papers are learning, communication and computational complexity. Examples of some contributions of these papers are
1. The first lower bound on unbounded error probabilistic communication complexity.
2. Definition and study of learning theoretic quantities such as margin complexity...
[ Full version ]
Wednesday, 16.04.2008, 16:30
In this work we study a wide range of online covering and
packing problems with various objectives. We provide a unified approach,
based on a clean primal-dual method, for the design and analysis of online
algorithms for this large class of problems. In particular, our
analysis uses weak duality rather than a tailor made (i.e., problem
specific) potential function. We show how to apply the primal-dual
approach to many interesting problems. Examples consist of the ski
rental prob...
[ Full version ]
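As a toy instance of the online problems mentioned above, here is the classical deterministic break-even rule for ski rental (the 2-competitive baseline); the talk's primal-dual framework is more general, and this sketch is not its algorithm.

    def ski_rental_cost(days_skied, buy_price, rent_price=1):
        """Deterministic break-even rule: rent until rentals would exceed the buy price, then buy."""
        threshold = buy_price // rent_price          # rent for at most this many days
        rented_days = min(days_skied, threshold)
        bought = days_skied > threshold
        return rented_days * rent_price + (buy_price if bought else 0)

    # Online cost is at most twice the optimal offline cost min(days * rent, buy).
    for days in (3, 10, 100):
        online = ski_rental_cost(days, buy_price=10)
        offline = min(days * 1, 10)
        print(days, online, offline, online / offline)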
Alfred M.Bruckstein (Computer Science, Technion)
Sunday, 13.04.2008, 14:30
I shall discuss the many ways one can do optimal image or
signal denoising, and the difficulties in selecting between
the various optimal solutions.
...
[ Full version ]
Wednesday, 09.04.2008, 16:30
A basic construct in many distributed systems is the detection of global properties over distributed data. Examples include a wireless sensor network, where we would like to receive an alert every time the average of the temperature readings taken by the sensors exceeds a given threshold, or a distributed search engine, where we would like to determine the set of search terms whose frequency of use exceeds a given threshold.
Until recently, research has focused on detecting gl...
[ Full version ]
Yair Moshe (Computer Science, Haifa Univ.)
Tuesday, 08.04.2008, 11:30
Motion estimation plays an important role in modern video coders. In such coders, motion is estimated using a block matching algorithm that estimates the amount of motion on a block‐by‐block basis. A full search technique for finding the best matching blocks delivers good accuracy but is usually not practical because of its high computational complexity. In this talk, a novel fast block‐based motion estimation algorithm is presented. This algorithm uses an efficient projecti...
[ Full version ]
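For orientation, a plain full-search block matcher (the expensive baseline that fast algorithms such as the one in the talk try to avoid); a small sketch with arbitrary block size and search radius, not the presented method.

    import numpy as np

    def full_search(prev, curr, top, left, block=16, radius=8):
        """Exhaustively find the motion vector of one block by minimizing the SAD."""
        target = curr[top:top + block, left:left + block]
        best, best_mv = np.inf, (0, 0)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                y, x = top + dy, left + dx
                if y < 0 or x < 0 or y + block > prev.shape[0] or x + block > prev.shape[1]:
                    continue
                sad = np.abs(prev[y:y + block, x:x + block].astype(int) - target.astype(int)).sum()
                if sad < best:
                    best, best_mv = sad, (dy, dx)
        return best_mv

    rng = np.random.default_rng(0)
    prev = rng.integers(0, 256, (64, 64), dtype=np.uint8)
    curr = np.roll(prev, shift=(2, -3), axis=(0, 1))        # frame content shifted by (2, -3)
    print(full_search(prev, curr, top=24, left=24))         # expect (-2, 3)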
Avraham Ben-Aroya (Tel-Aviv Univ.)
Sunday, 06.04.2008, 11:00
Reingold, Vadhan and Wigderson [FOCS'00] introduced the graph zig-zag
product. This product combines a large graph and a small graph into one
graph, such that the resulting graph inherits its size from the large graph,
its degree from the small graph and its spectral gap from both. Using this
product they gave the first fully-explicit combinatorial construction of
expander graphs. They showed how to construct $D$-regular graphs having
spectral gap $1-O(D^{-\frac{1}{3}})$. In...
[ Full version ]
CGGC Seminar: Yaron Lipman (Computer Science, Tel Aviv Univ.)
Sunday, 30.03.2008, 13:00
Deforming objects in a "shape preserving" manner is a challenging problem
with various applications in geometric modeling and computer graphics. In
this talk I will examine two geometric-driven approaches to the problem.
First, reformulating the problem using the notion of Cartan's moving frames
reveals an intriguing relationship between harmonic maps into the group of
rotations and shape preserving deformations. The moving frames are known in
di...
[ Full version ]
Sunday, 30.03.2008, 11:00
The basic foundations of the Internet are surprisingly vulnerable to simple attacks that could be mounted
by rogue hackers or result from simple misconfiguration. One area that remains particularly exposed to
attacks is routing: packets usually traverse many intermediate routers on their way from a source to their
destination. Since these routers may belong to other (possibly malicious) organizations, how do we
guarantee that our packets arrive as intended at their desti...
[ Full version ]
Alexander Brook (Computer Science, Technion)
Monday, 17.03.2008, 10:30
The lecture deals with invariance in image processing, with emphasis on
invariance of shapes and curves. There are two main topics. First, we
consider scale invariance, review existing scale invariant measures and
discuss the different ways in which an algorithm can be scale invariant.
We then point out some dangers inherent in fully scale invariant
algorithms. Second, we present an equi-affine invariant shape matching
method, which is also invariant to resampling, robust to...
[ Full version ]
Robert Krauthgamer (Weizmann Institute and IBM Almaden)
Sunday, 02.03.2008, 16:30
The edit distance between two strings is defined as the number of
insertions/deletions/substitutions needed to transform one string into
the other. This distance plays a key role in several fields like
computational biology and text processing.
Edit distance appears to be notoriously difficult to deal with
algorithmically (both theoretically and in practice). However, no
nontrivial computational lower bounds for it are known, and the only
negative evidence known i...
[ Full version ]
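The textbook dynamic program for the distance being discussed (quadratic time, which is exactly the barrier the talk is about); a standard sketch, not the speaker's contribution.

    def edit_distance(a, b):
        """Levenshtein distance: minimum insertions/deletions/substitutions turning a into b."""
        prev = list(range(len(b) + 1))            # distances from a[:0] to every prefix of b
        for i, ca in enumerate(a, start=1):
            curr = [i]
            for j, cb in enumerate(b, start=1):
                curr.append(min(prev[j] + 1,                      # delete ca
                                curr[j - 1] + 1,                  # insert cb
                                prev[j - 1] + (ca != cb)))        # substitute (or match)
            prev = curr
        return prev[-1]

    print(edit_distance("kitten", "sitting"))   # 3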
Prof. David Chillag (Mathematics, Technion)
Wednesday, 27.02.2008, 16:30
The mathematical theorem having the longest
proof is the classification of the finite simple
groups. In this talk we will try to understand the
statement of the theorem and its importance.
The lecture will be given in Hebrew....
[ Full version ]
Cristian Canton-Ferrer (Image and Video Processing Group
UPC, Barcelona Spain)
Tuesday, 26.02.2008, 11:00
Multi-camera environments allow exploiting spatial redundancy that might be
used by analysis algorithms to produce more robust results. These algorithms
may be applied for a variety of tasks ranging from human 3D scene volume
reconstruction, human motion capture, gesture analysis or focus of attention
tracking. Combining this multi-camera information with audio information
would allow defining multimodal data analysis systems. This talk will
present two research lines within...
[ Full version ]
Zeev Dvir (Weizmann Institute of Science)
Sunday, 24.02.2008, 11:00
We show that lower bounds for bounded depth arithmetic circuits imply
derandomization of polynomial identity testing for bounded depth
arithmetic circuits. More formally, if there exists an explicit
polynomial f(x_1,...,x_m) that cannot be computed by a depth d
arithmetic circuit of small size then there exists an efficient
deterministic algorithm to test whether a given depth d-8 circuit is
identically zero or not (assuming the individual degrees of the tested
circuit are ...
[ Full version ]
Sunday, 17.02.2008, 14:30
Our world is filled with nonrigid objects that can deform and
bend. Nonrigid shapes appear at all scales - from our body, its
organs and tissues, to tiny bacteria and microscopic protein
molecules. When trying to analyze nonrigid shapes the richness of
the possible deformations poses a challenge due to the large
number of degrees of freedom. Thus, generally speaking, explicit
analysis of nonrigid objects has so far been avoided. Today, there
is a co...
[ Full version ]
Shira Nemirovsky, EE, Technion
Tuesday, 12.02.2008, 11:30
Random field models characterize the correlation among neighboring pixels in an image. Specifically, a wide-sense Markov model is obtained by assuming a separable correlation function for a 2D auto-regressive (AR) model. This model is applicable to image restoration, image compression, and texture classification and segmentation. In this work we explore the effect of sampling on statistical features of an image such as histogram and the autocorrelation function. We show that the M...
[ Full version ]
Udi Bahat (Systematics Ltd.)
Sunday, 10.02.2008, 13:00
"The Challenge", Definitions, GUI, basic and smart geometry in 2D and 3D,
basic model, "styled" model, shape manipulations, hybrid design, develop
challenge.
...
[ Full version ]
Irit Dinur (Weizmann Institute of Science)
Sunday, 10.02.2008, 11:00
Given a function f:[n]-->{0,1}, its k-wise direct product encoding
is the function F:[n]^k --> {0,1}^k defined by
F(x_1,...,x_k) = (f(x_1),...,f(x_k))
This simple "encoding" is useful in PCP-like settings, when we want to
simulate reading k arbitrary values of f by accessing only a constant
number of inputs of the encoding F.
The main challenge is to test that a given F is indeed a "correct"
encoding of some f by reading only few (ultimately 2) random values in
F. This is ca...
[ Full version ]
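A tiny sketch of the object being tested: the encoding itself and a natural two-query consistency check on tuples that share a few coordinates. The actual tests and analysis in the talk are of course more subtle; parameter choices below are arbitrary.

    import random

    def direct_product(f, k):
        """k-wise direct product encoding: F(x_1,...,x_k) = (f(x_1),...,f(x_k))."""
        return lambda xs: tuple(f(x) for x in xs)

    def consistency_test(F, n, k, overlap=2, trials=1000):
        """Two-query test: pick tuples sharing `overlap` coordinates and compare F there."""
        for _ in range(trials):
            shared = [random.randrange(n) for _ in range(overlap)]
            xs = shared + [random.randrange(n) for _ in range(k - overlap)]
            ys = shared + [random.randrange(n) for _ in range(k - overlap)]
            if F(tuple(xs))[:overlap] != F(tuple(ys))[:overlap]:
                return False      # caught an inconsistency on the shared coordinates
        return True

    f = lambda x: x % 2           # some f: [n] -> {0,1}
    F = direct_product(f, k=4)
    print(consistency_test(F, n=100, k=4))   # a genuine encoding always passes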
Eran Eyal (CB, School of Medicine, University of Pittsburgh)
Thursday, 07.02.2008, 13:30
Multiple sequence alignments possess important information, not only about functionally and structurally important conserved regions, but also about coupling between residues. Recently we have developed a unique method to detect such coupling based on substitution matrices of residue pairs. I will describe the method and show how it performs at structural analysis tasks as contact prediction and discrimination of decoy sets. I will then show recent results that demonstrate how cor...
[ Full version ]
Shahar Dobzinski (CS @ Engineering, Huji)
Sunday, 03.02.2008, 11:00
Consider the following simple type of approximation algorithms: limit the set of possible outcomes, and completely optimize over the restricted subrange. In this talk we will study the power of this type of algorithms in two settings: multi-unit auctions, and combinatorial auctions with subadditive bidders.
From a game-theoretic point of view, our results provide a characterization of the power of the main positive result of mechanism design: the VCG mechanism. Another ...
[ Full version ]
Pierre Alliez (CS, INRIA Sophia Antipolis)
Wednesday, 30.01.2008, 15:00
A Delaunay triangulation restricted to a surface is obtained by
filtering a 3D Delaunay triangulation. Filtering herein consists of
selecting a subset of the 3D Delaunay facets whose dual edges intersect
the surface. In this talk I will explain how the restricted Delaunay
triangulation paradigm can be used to simultaneously mesh a surface
together with its interior. I will then describe two mesh generation
algorithms along these lines and show a demo made with components fro...
[ Full version ]
Ori Honigman (EE, Technion)
Tuesday, 29.01.2008, 11:30
The complex diffusion process, recently introduced in image processing and computer vision by combining the linear diffusion equation and the 'free-particle' Schrödinger equation, is further generalized by incorporating the Schrödinger potential. We incorporate the potential into the complex diffusion equation, introducing some kind of 'dynamic boundary conditions' that serve as a filtering or even enhancing mechanism for textures. The Schrödinger potential is self-adapting to ...
[ Full version ]
Jeremy-Yrmeyahu Kaminski, Dept. of Mathematics, Holon Inst. of Technology
Monday, 28.01.2008, 10:30
1. Basic concepts in algebraic geometry (affine and projective varieties, morphisms, dimension, degree)
2. Computational tools: Groebner basis, resultants
3. Basic applications including solving polynomial systems
4. Some more involved constructions like Grassmannians and Chow forms.
...
[ Full version ]
Dan Halperin, Dept. of Computer Science, Tel Aviv University
Sunday, 27.01.2008, 13:00
The Computational Geometry Algorithms Library, CGAL,
is the largest software collection of algorithms and data
structures in Computational Geometry available today.
It started a little over a decade ago as a European research
project with a small number of partners and has grown
over the years to be a huge open source project. The
arrangement package of CGAL, developed at Tel Aviv
University, constructs, maintains, traverses, and answers
queries on two-dimensional arrangem...
[ Full version ]
Vinod Vaikuntanathan, MIT
Sunday, 27.01.2008, 11:00
We show that disjointness requires randomized communication
Omega(n^{1/2k} / (k-1)2^{k-1}2^{2^{k-1}}) in the general k-party
number-on-the-forehead model of complexity. The previous
best lower bound was Omega((\log n)/(k-1)). By results of Beame, Pitassi, and
Segerlind, this implies 2^{n^{Omega(1)}} lower bounds on the size of tree-like Lovasz-Schrijver proof
systems needed to refute certain unsatisfiable CNFs, and super-polynomial lower bounds on the
size of any tree-lik...
[ Full version ]
Adi Shraibman, Weizmann Institute of Science
Sunday, 20.01.2008, 11:00
We show that disjointness requires randomized communication
Omega(n^{1/2k}/((k-1)2^{k-1}2^{2^{k-1}}))
in the general k-party number-on-the-forehead model of complexity. The
previous best lower bound was Omega((log n)/(k-1)). By results of Beame,
Pitassi, and Segerlind, this implies 2^{n^{Omega(1)}} lower bounds on the
size of tree-like Lovasz-Schrijver proof systems needed to refute certain
unsatisfiable CNFs, and super-polynomial lower bounds on the size of any
tree-like p...
[ Full version ]
Monday, 14.01.2008, 11:00
Algorithms for video-based human motion capture typically assume the body shape is known a priori and is represented coarsely (e.g. using cylinders to model body parts). These body models stand in sharp contrast to the richly detailed 3D body models used by animators. In this talk we describe a method for recovering detailed body shape models directly from images. Specifically, we represent the body using a recently proposed triangulated mesh model called SCAPE which employs a ...
[ Full version ]
Sunday, 13.01.2008, 11:00
In this paper we study multilinear formulas, monotone arithmetic circuits, maximal partition
discrepancy, best partition communication complexity and mixed two source extractors.
We will first define maximal partition discrepancy, and construct a function f that has
exponentially small maximal partition discrepancy. We will then review several application
of this construction:
1. The best partition communication complexity of f is \Theta(n).
...
[ Full version ]
Thursday, 10.01.2008, 11:00
This research concerns the image deblurring and noise removal problem in a variational framework.
Energy functionals in this study consist of a fidelity term and a regularizer that is based on
Mumford-Shah segmentation, such that the recovered image and its discontinuities set are simultaneously
extracted in the course of the deblurring process. We first consider the image deblurring problem in
the presence of Gaussian/impulsive noise and present a convergence result for t...
[ Full version ]
Tuesday, 08.01.2008, 11:30
Recent studies have stressed the importance of object comparison for category learning. Recently we compared adults' capacity to learn a new categorization principle in a condition in which a few exemplars were identified as belonging to the same category, versus a condition in which exemplars were identified as belonging to different categories. We found that almost all participants learned the categorization rule quite efficiently in the Same-Class Exemplars condition. In contrast, when not p...
[ Full version ]
Sunday, 06.01.2008, 11:30
This talk describes 3D virtual colonoscopy (VC), a combination of computed tomography (CT) scanning and volume visualization technology, which is poised to become the procedure of choice in lieu of conventional optical colonoscopy for mass screening for colon cancer. The talk further discusses a novel pipeline of computer-aided detection (CAD) of colonic polyps – the precursor of colorectal cancer – complementing VC.
In VC, the patient abdomen is imaged by a helical CT scan...
[ Full version ]
Tuesday, 01.01.2008, 11:30
Non-rigid shape similarity is an important problem in computer vision and pattern recognition. Broadly speaking, non-rigid shape similarity criteria are divided into intrinsic and extrinsic, the first referring to the metric structure of the objects and the latter to the geometry of the shapes in the Euclidean space. Both criteria have their advantages and disadvantages; extrinsic similarity is sensitive to non-rigid deformations of the shapes, while intrinsic similarity is sensit...
[ Full version ]