The Monte Carlo method is one of the most widely used numerical methods for simulating probability distributions. Its convergence rate, of order N^{-1/2} for N samples, is independent of the dimension but slow. Quasi-Monte Carlo methods, which can be viewed as deterministic versions of Monte Carlo methods, have been developed to improve this convergence rate; their accuracy depends in part on generating sample points with small discrepancy. Carrying the quasi-Monte Carlo idea over to statistical sampling promises faster convergence and wider practical applicability. In this thesis we focus on constructing low-discrepancy point sets with respect to non-uniform target measures using the acceptance-rejection sampler. We consider acceptance-rejection samplers based on different driver sequences, where the driver sequence is chosen such that the discrepancy between the empirical distribution of the accepted points and the target distribution is small. Digital nets, stratified inputs and lattice point sets are used for this purpose. The central contribution of this work is the establishment of discrepancy bounds for samples generated by acceptance-rejection samplers. Combined with a Koksma-Hlawka type inequality, these bounds yield an improved numerical integration error for non-uniform measures. Furthermore, we introduce a quality criterion for measuring the suitability of driver sequences in the acceptance-rejection method. Explicit constructions of driver sequences achieve a convergence order beyond that of plain Monte Carlo for samples generated by deterministic acceptance-rejection samplers in dimension one. The proposed algorithms are tested numerically and compared with the standard acceptance-rejection algorithm using pseudo-random inputs. The empirical evidence confirms that using low-discrepancy driver sequences in the acceptance-rejection sampler outperforms the original algorithm.
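To make the idea of a deterministic acceptance-rejection sampler concrete, the following is a minimal one-dimensional sketch (not the thesis's actual construction): a two-dimensional Hammersley-type driver point set, built from equispaced first coordinates paired with the van der Corput sequence in base 2, drives the standard accept/reject test against a bounded target density on [0, 1]. The function names `van_der_corput` and `ar_sample` and the choice of driver set are illustrative assumptions.

```python
def van_der_corput(n, base=2):
    """Radical inverse of the integer n in the given base; a value in [0, 1)."""
    q, denom = 0.0, 1.0
    while n:
        denom *= base
        n, r = divmod(n, base)
        q += r / denom
    return q

def ar_sample(f, c, n):
    """Deterministic acceptance-rejection sampler (illustrative sketch).

    f: target density on [0, 1], bounded above by c.
    Driver sequence: the 2D low-discrepancy point set (i/n, van_der_corput(i)).
    Returns the list of accepted first coordinates.
    """
    accepted = []
    for i in range(n):
        x = i / n                  # candidate point: equispaced first coordinate
        y = van_der_corput(i)      # accept/reject coordinate: van der Corput base 2
        if c * y <= f(x):          # standard acceptance-rejection test
            accepted.append(x)
    return accepted
```

For example, with the target density f(x) = 2x and bound c = 2, the acceptance rate approaches 1/2 (the integral of f divided by c), and the empirical distribution of the accepted points approximates the target distribution x^2.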