In Chapter 1, we explore the joint behaviour of the summands of a random walk when their mean value tends to infinity as the length of the walk increases. It is proved that all the summands must share the same value, which extends previous results on large exceedances of finite sums of i.i.d. random variables.

In Chapter 2, we state a conditional Gibbs theorem for a random walk (X_1, ..., X_n) conditioned on an extreme deviation event. It is proved that when the summands have light tails satisfying an additional regularity property, the asymptotic conditional distribution of X_1 can be approximated in variation norm by the tilted distribution, thereby extending the classical LDP case.

Chapter 3 explores maximum likelihood estimation in parametric models in the context of Sanov-type large deviation probabilities. The MLE in parametric models under weighted sampling is shown to be associated with the minimization of a specific divergence criterion defined with respect to the distribution of the weights. Some properties of the resulting inferential procedure are presented; Bahadur efficiency of tests is also considered in this context.
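For concreteness, the tilted distribution invoked in the conditional Gibbs theorem is classically the exponential tilt (Esscher transform) of the common density of the summands. The following is a standard sketch, not the thesis's own statement; the symbols p, alpha, phi and the conditioning level a are illustrative notation.

```latex
% Exponential tilt of a density p at parameter \alpha (standard definition):
\[
  \pi_{\alpha}(x) \;=\; \frac{e^{\alpha x}\, p(x)}{\phi(\alpha)},
  \qquad
  \phi(\alpha) \;=\; \int e^{\alpha x}\, p(x)\, \mathrm{d}x ,
\]
% with \alpha chosen so that the tilted mean matches the conditioning level a,
% i.e. the mean of the summands on the deviation event:
\[
  \frac{\mathrm{d}}{\mathrm{d}\alpha}\, \log \phi(\alpha) \;=\; a .
\]
```

Under light-tail assumptions, classical LDP conditioning (a fixed above the mean) makes X_1 asymptotically distributed as pi_alpha; the extreme-deviation regime lets a grow with n, which is where the additional regularity conditions enter.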