Assignment 4: Graphs, Matrices and Tensors


The deadline is July 21st at 10:00 Saarbrücken standard-time. You are free to hand in earlier. You will have to choose one topic from the list below, read the articles, and hand in a report that critically discusses this material and answers the assignment questions. Reports should summarise the key aspects, but more importantly, should include original and critical thought that show you have acquired a meta level understanding of the topic – plain summaries will not suffice. All sources you use should be appropriately referenced, any text you quote should be clearly identified as such. The expected length of a report is between 3 to 5 pages, but there is no limit.

For the topic of your assignment, choose one of the following:

  1. All Those Different Models

    Read the two survey articles on tensor factorizations: [1] and [2]. Together, they list dozens of different tensor decomposition models (even without counting the non-negative variants). Why are there so many different models? Are they all needed? Are some just trivial variants of others? What are the most and least important ones (in your opinion) and why?

  2. A Tensor or Just a Bunch of Matrices

    Read [3] and [4]. These papers present two different ways of analysing data using tensors. But do they really need tensors, or are they just working on a collection of matrices?

    [3] uses time in the third mode: is their use of tensor factorization still justified? Or do they ignore the temporal order? Try to identify other potential weaknesses in these papers.

  3. Eigen and Tensor Faces

    Read [5]. How does TensorFaces utilize tensor decompositions? How does it improve over Eigenfaces [6] (if at all)? Could you replicate TensorFaces using just Eigenfaces? Is TensorFaces scalable (in time and space)? Is TensorFaces a special application, or does it exhibit some more general theme that arises when moving from matrices (2-way tensors) to \(N\)-way tensors?

  4. Dot the i's and Cross the t's

    Dot product graphs, as explained in the lecture, are undirected graphs whose adjacency matrix \(A\) can be expressed as \(A = \tau(XX^T)\), where \(\tau\) is the elementwise threshold operator. If the graph is directed, the adjacency matrix is no longer symmetric and the factorization assumes the form \(A = \tau(XY^T)\).

    In this assignment, we consider two approaches to finding approximate dot product graphs of fixed rank. That is, we are given the adjacency matrix \(A\) and an integer \(k\), and our goal is to find \(n\times k\) matrices \(X\) and \(Y\) such that \(\tau(XY^T)\) is as close to \(A\) as possible, measured as the Hamming distance (i.e. the number of differing entries) between \(A\) and \(\tau(XY^T)\).

    To this end, consider first Boolean Matrix Factorization (BMF) [7], as discussed earlier in the course. If we apply BMF to \(A\), would the resulting factor matrices be good examples of \(X\) and \(Y\) for dot product graphs? What would be their strengths and weaknesses? Would you consider using BMF for this task?

    As the second approach, read [8]. Here, the authors present an approach for computing Logistic PCA (LPCA). How is LPCA related to our problem? Could their algorithm be used to solve it? Why or why not? What, if anything, can you say about the hardness of the approximate directed dot product graph problem given the LPCA paper?
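    To make the objective in topic 4 concrete, the following is a minimal sketch of evaluating a candidate factorization. The threshold value \(0.5\) is an assumption for illustration; the handout leaves the exact form of \(\tau\) to the cited material.

    ```python
    import numpy as np

    def tau(M, t=0.5):
        """Elementwise threshold operator: 1 where M >= t, else 0.
        The threshold t = 0.5 is an illustrative assumption."""
        return (M >= t).astype(int)

    def hamming_objective(A, X, Y, t=0.5):
        """Number of entries where tau(X Y^T) disagrees with A --
        the quantity the approximate dot product graph problem minimises."""
        return int(np.sum(A != tau(X @ Y.T, t)))

    # Toy directed graph on 3 nodes and a random rank-2 candidate (X, Y).
    A = np.array([[0, 1, 1],
                  [0, 0, 1],
                  [1, 0, 0]])
    X = np.random.rand(3, 2)
    Y = np.random.rand(3, 2)
    print(hamming_objective(A, X, Y))  # objective value; 0 would be an exact fit
    ```

    Note that such an evaluation says nothing about how to *find* good \(X\) and \(Y\); that is precisely where BMF [7] and LPCA [8] enter the discussion.
    
    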

Return the assignment by email to tada-staff@mpi-inf.mpg.de by 21 July, 1000 hours. The subject of the email must start with [TADA]. The assignment must be returned as a PDF and it must contain your name, matriculation number, and e-mail address together with the exact topic of the assignment.

Grading will take into account both the hardness of the questions and whether you answer the bonus questions.



References

You will need a username and password to access the papers outside the MPI network. Contact the lecturer if you don't know the username or password.

[1] Acar, E. & Yener, B. Unsupervised Multiway Data Analysis: A Literature Survey. IEEE Trans. Knowl. Data En., 21(1):6-20, 2009.
[2] Kolda, T.G. & Bader, B.W. Tensor decompositions and applications. SIAM Rev., 51(3):455-500, 2009.
[3] Dunlavy, D., Kolda, T.G. & Acar, E. Temporal Link Prediction Using Matrix and Tensor Factorizations. ACM Trans. Knowl. Discov. Data, 5(2), 2011.
[4] Nickel, M., Tresp, V. & Kriegel, H.P. Factorizing YAGO: Scalable Machine Learning for Linked Data. In WWW '12, pages 271-280, 2012.
[5] Vasilescu, M.A.O. & Terzopoulos, D. Multilinear Analysis of Image Ensembles: TensorFaces. In ECCV '02, pages 447-460, 2002.
[6] Turk, M. & Pentland, A. Eigenfaces for Recognition. J. Cognitive Neurosci., 3(1):71-86, 1991.
[7] Miettinen, P., Mielikäinen, T., Gionis, A., Das, G. & Mannila, H. The Discrete Basis Problem. TKDE, 20(10):1348-1362, IEEE, 2008.
[8] Schein, A.I., Saul, L.K. & Ungar, L.H. A Generalized Linear Model for Principal Component Analysis of Binary Data. In AISTATS '03, 2003.