We have eight papers for you this month. Our potpourri spans graph property testing, testing properties of real-valued and boolean functions, property testing with a quantum twist, and learning under distribution shift. A featured paper this month covers progress on optimal non-adaptive junta testers. Without further ado, let’s dig in. As usual, please let us know if we missed any papers.
Testing \(C_k\) freeness in bounded-arboricity graphs by Talya Eden, Reut Levi, and Dana Ron (arXiv) Our post from July 2021 highlighted an open problem posed by Goldreich, which asks whether property testers for bounded-degree graphs can be transformed into property testers for graphs of unbounded degree but bounded arboricity. The featured paper answers this question in the negative. In particular, testing \(C_4\)-freeness in bounded-arboricity graphs (with possibly unbounded degree) already admits an \(\Omega(n^{1/4})\) lower bound for one-sided error testers. Up to \(\log\) factors, the paper also proves a matching upper bound. The same bounds hold for \(C_5\)-freeness. Further, for every \(k \geq 6\), the paper proves an \(\Omega(n^{1/3})\) lower bound for one-sided error testers.
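To keep things concrete, here is a naive brute-force check of what \(C_4\)-freeness means for an explicit graph. This is just the definition in code (a graph contains a \(C_4\) precisely when some pair of vertices has at least two common neighbors), certainly not the sublinear tester from the paper, and the adjacency-set representation is my own choice.

```python
from itertools import combinations

def has_c4(adj):
    """adj: dict mapping each vertex to the set of its neighbours (simple undirected graph).
    Returns True iff the graph contains a 4-cycle: this happens exactly when
    some pair of vertices has at least two common neighbours."""
    vertices = list(adj)
    for u, w in combinations(vertices, 2):
        if len(adj[u] & adj[w]) >= 2:
            return True
    return False

# Example: a 4-cycle (1-2-3-4) plus a pendant edge (4-5).
g = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3, 5}, 5: {4}}
print(has_c4(g))  # True
```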
Directed Isoperimetry and Monotonicity Testing: A Dynamical Approach by Renato Ferreira Pinto Jr (arXiv) The featured paper considers a classic of property testing: you want to test whether an input real-valued function \(f \colon [0,1]^d \to \mathbb{R}\) over the solid cube is monotone or \(\varepsilon\)-far from being monotone. It is convenient to require that the input function be differentiable and Lipschitz (for those of you keeping score, this should remind you of our post from July 2023, where we covered a previous work by Renato in the same setting). Motivated by the success of testers developed for the boolean hypercube domain, a natural analogy suggests equipping the tester, in addition to the standard value oracle, with a derivative oracle which takes as input a point \(x \in [0,1]^d\) and a direction \(\mathbf{v} \in \mathbb{R}^d\) and returns the directional derivative of \(f\) at \(x\) along \(\mathbf{v}\). Renato’s punchline is a directed Poincaré inequality which, in the above setup, connects the distance to monotonicity to the square root of a directed analog of a suitable notion of influence. The techniques used in the paper are intriguing: they are inspired by the original proof of the classic Poincaré inequality. In particular, as Renato notes, “the main theme of our proof of this result is the study of the convergence of a partial differential equation.”
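As a toy illustration of the query model (and emphatically not the paper's tester), here is how one might use such a derivative oracle to estimate a directed, negative-gradient style quantity by sampling. The estimator below, the stand-in oracle `directional_derivative`, and the choice of test function are all my own; the paper's precise notion of directed influence may differ.

```python
import random, math

d = 3

def f(x):
    # a toy smooth function on [0,1]^3 that is not monotone in x[2]
    return x[0] + x[1] - x[2] ** 2

def directional_derivative(x, v, h=1e-5):
    # stand-in for the derivative oracle: a finite-difference approximation
    # of the directional derivative of f at x along v
    return (f([xi + h * vi for xi, vi in zip(x, v)]) - f(x)) / h

def estimate_negative_influence(samples=2000):
    # Monte Carlo estimate of E_x[ ||(grad f(x))^-||_2 ], the expected norm of the
    # coordinate-wise negative part of the gradient: one natural directed analogue
    # of influence (a sketch only).
    total = 0.0
    for _ in range(samples):
        x = [random.random() for _ in range(d)]
        neg_parts = []
        for i in range(d):
            e_i = [1.0 if j == i else 0.0 for j in range(d)]
            neg_parts.append(min(directional_derivative(x, e_i), 0.0))
        total += math.sqrt(sum(p * p for p in neg_parts))
    return total / samples

print(estimate_negative_influence())  # strictly positive, since f decreases in x[2]
```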
Optimal Non-Adaptive Tolerant Junta Testing via Local Estimators by Shivam Nadimpalli and Shyamal Patel (arXiv) The problem of tolerant junta testing is no stranger to our regular readers, and this paper should be intriguing for those of you who care about testing function properties. As the abstract notes, this is the first paper to nail down the true non-adaptive query complexity for tolerantly testing some natural property of boolean functions. The main result is a lower bound of \(2^{\widetilde{\Omega}(\sqrt{k \log(1/\varepsilon)})}\) for non-adaptive tolerant \(k\)-junta testing, together with an almost matching non-adaptive algorithm.
Testing Intersectingness of Uniform Families by Ishay Haviv and Michal Parnas (arXiv) Let \(\mathcal{F}\) denote an intersecting family of subsets of an underlying \(n\)-element universe; that is, for any \(F_1, F_2 \in \mathcal{F}\), you have \(F_1 \cap F_2 \neq \emptyset\). Some of you might immediately recall the Erdős–Ko–Rado theorem, which asserts an upper bound on the size of such an intersecting family when every set has size \(k\). Another famous result is Lovász’s (positive) resolution of Kneser’s conjecture, which asserts a lower bound on the number of intersecting families you need to cover all \(k\)-subsets of \([n]\). For ease of discussion, let us follow the authors and cook up a property testing problem \(\textsf{INTERSECTING}_{n,k,\varepsilon}\): you are given access to the indicator \(f \colon {[n] \choose k} \to \{0,1\}\) encoding the family \(\mathcal{F}\), and you ask whether \(\mathcal{F}\) is intersecting or \(\varepsilon\)-far from being intersecting. Recently, Chen-De-Li-Nadimpalli-Servedio explored the non-uniform-set-size variant of this problem (which we covered here). They presented one-sided algorithms with a non-adaptive query bound of \(2^{\widetilde{O}(\sqrt{n \log(1/\varepsilon)})}\), along with an almost matching lower bound. The featured paper contrasts these results with the situation when all the set sizes are indeed the same (that is, it explores the testability of \(\textsf{INTERSECTING}_{n,k,\varepsilon}\)). Of the numerous results in the paper, let me highlight one: you can test \(\textsf{INTERSECTING}_{n,k,\varepsilon}\) with two-sided error using a mere \(O\big(\log n \cdot \frac{1}{\varepsilon} \big)\) queries. This result holds for \(\varepsilon \geq \Omega(k/n)^r\) for a fixed \(r\).
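For concreteness, here is the intersecting property itself in code: a brute-force check over all pairs of members of a family of \(k\)-subsets. This is merely the definition (not the tester from the paper), and representing the family as a list of Python frozensets is my own choice.

```python
from itertools import combinations

def is_intersecting(family):
    """family: an iterable of k-subsets of [n], each given as a frozenset.
    Returns True iff every pair of members has a non-empty intersection."""
    sets = list(family)
    return all(s & t for s, t in combinations(sets, 2))

# Example over [6] with k = 3: the "star" of all 3-subsets containing the element 1.
n, k = 6, 3
star = [frozenset(s) for s in combinations(range(1, n + 1), k) if 1 in s]
print(is_intersecting(star))                            # True
print(is_intersecting(star + [frozenset({4, 5, 6})]))   # False: {4,5,6} misses {1,2,3}
```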
Distribution-Free Testing of Decision Lists with a Sublinear Number of Queries by Xi Chen, Yumou Fei and Shyamal Patel (arXiv) Decision lists are a popular and convenient way to represent a boolean function which is learnable from examples. In more detail, a decision list is a collection of pairs \((\alpha_1, \beta_1), (\alpha_2, \beta_2), \ldots, (\alpha_k, \beta_k)\) (here, the \(\alpha_j\)’s denote literals and the \(\beta_j\)’s are bits). This list defines a boolean function \(f \colon \{0,1\}^n \to \{0,1\}\) as follows: for any \(\mathbf{x} \in \{0,1\}^n\), you let \(f(\mathbf{x}) = \beta_j\) where \(j\) is the smallest index such that \(\alpha_j\) is satisfied by \(\mathbf{x}\). With this (not super rigorous) definition out of the way, I can now tell you about the main result of this featured paper, which asserts that in the distribution-free framework for property testing, you still get sublinear algorithms for testing decision lists. In particular, thanks to this paper, you can now engineer a two-sided, adaptive, distribution-free algorithm for testing decision lists which runs in time \(\widetilde{O}(n^{11/12})\).
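Here is a minimal sketch of the decision-list semantics described above. The encoding of a literal as a (variable index, sign) pair and the default output bit when no literal fires are my own conventions, just to make the definition executable.

```python
def make_decision_list(pairs, default=0):
    """pairs: list of ((i, sign), beta), where the literal alpha_j is x_i when sign
    is True and NOT x_i when sign is False, and beta is that rule's output bit.
    Returns the boolean function f: {0,1}^n -> {0,1} defined by the list."""
    def f(x):
        for (i, sign), beta in pairs:
            if bool(x[i]) == sign:   # literal alpha_j is satisfied by x
                return beta          # first satisfied literal decides the value
        return default               # no literal fired
    return f

# Example: "if x_2 then 1; else if NOT x_0 then 0; else 1".
f = make_decision_list([((2, True), 1), ((0, False), 0)], default=1)
print(f([1, 0, 1]))  # 1 (first rule fires)
print(f([0, 1, 0]))  # 0 (second rule fires)
print(f([1, 1, 0]))  # 1 (default)
```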
Simple algorithms to test and learn local Hamiltonians by Francisco Escudero Gutiérrez (arXiv) The featured paper considers the task of (tolerantly) testing and learning an \(n\)-qubit \(k\)-local Hamiltonian from queries to its evolution operator. The main result of the paper asserts that the task of Tolerant Hamiltonian Locality Testing can be done with a mere \(1/(\varepsilon_2 - \varepsilon_1)^8\) queries to the evolution operator.
Local Test for Unitarily Invariant Properties of Bipartite Quantum States by Kean Chen, Qisheng Wang, Zhicheng Zhang (arXiv) Many investigations of quantum entanglement consider bipartite quantum states, and the featured paper considers the task of testing properties of bipartite pure states. The paper begins by recalling a helpful duality: a property of bipartite pure states that is unitarily invariant on one part is locally testable on the other part. On its own, this duality does not offer any insight into the query complexity of the local tester. The main result of the paper proves that the local tester achieves sample complexity that is optimal even among global testers.
Learning Intersections of Halfspaces with Distribution Shift: Improved Algorithms and SQ Lower Bounds by Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan (arXiv) Consider the standard setup of supervised learning with a twist: the distribution \(\mathcal{D}\) from which you receive samples in the training phase and the distribution \(\mathcal{D}'\) from which samples are drawn in the test phase are not necessarily the same (hence the name: distribution shift). For example, this situation may arise if you use one set of equipment in the training phase and another during the test phase. This model was explored in a previous work by the same set of authors, where they considered the task of obtaining a classifier with low error on \(\mathcal{D}'\) when they are additionally told that the training samples pass some helpful test. In this paper, the authors explore this question for intersections of halfspaces with respect to Gaussian training distributions. The main contribution of the paper is a set of new positive results which extend results on PAC learnability to learnability under the new model. Indeed, under some reasonably mild assumptions, the bounds in the new model match the bounds from the standard PAC model. For quantitative details, please refer to the paper.