Private data analytics systems should provide analysts with the accuracy they require while guaranteeing the specified privacy to the individuals whose data is analyzed. Devising a general system that works across a broad range of datasets and analytic scenarios has proven difficult.
Despite the advent of differentially private systems with proven formal privacy guarantees, industry still relies on their ad-hoc predecessors, which offer weaker privacy protection but better analytic accuracy. Differentially private mechanisms often must add large amounts of noise to statistical results, which impairs their usability.
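To illustrate the accuracy cost mentioned above, the following is a minimal sketch of the standard Laplace mechanism (not code from this thesis): a count query has sensitivity 1, so satisfying epsilon-differential privacy requires adding Laplace noise of scale 1/epsilon. The function names and parameters here are illustrative assumptions, not part of any system described in this work.

```python
import random

def laplace_noise(scale):
    # The difference of two iid Exponential(1/scale) samples
    # is distributed as Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(true_count, epsilon):
    # A counting query has sensitivity 1 (adding or removing one
    # individual changes the count by at most 1), so Laplace noise
    # with scale 1/epsilon suffices for epsilon-differential privacy.
    return true_count + laplace_noise(1.0 / epsilon)
```

The noise scale grows as epsilon shrinks: a strict privacy budget (small epsilon) forces noise whose magnitude can rival the statistic itself, which is precisely the usability problem described above.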
In my thesis I follow two approaches to improve the usability of private data analytics systems in general and differentially private systems in particular. First, I revisit ad-hoc mechanisms and explore the possibilities of systems that do not provide Differential Privacy or provide only a weak version thereof. In my second approach I use the insights gained before to propose UniTraX, the first differentially private analytics system that allows analysts to analyze part of a protected dataset without affecting other parts and without having to give up guaranteed accuracy.