Law of large numbers with continuous dependency

Suppose we have an i.i.d. sequence of random variables $X_n \sim X$, and a sequence $Y_n \to y_0$ in probability, where $y_0$ is a constant. Suppose also that $f : \mathbb{R}^2 \to \mathbb{R}$ is continuous.



First question: Is it the case that



$$\frac{1}{n} \sum_{i=1}^{n} f(X_i, Y_n) \to \mathbb{E}[f(X, y_0)]$$



in probability? I suspect the answer is no, but I am having trouble finding a counterexample.


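As a quick numerical sanity check, here is a minimal Monte Carlo sketch of the averaged quantity for one specific choice of $X$, $Y_n$, and $f$; the distributions and the function below are illustrative assumptions only, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: X ~ N(0, 1), Y_n = y_0 + Z / sqrt(n) with Z ~ N(0, 1),
# and f(x, y) = x * y + y**2, so that E[f(X, y_0)] = y_0 * E[X] + y_0**2 = 1.
def f(x, y):
    return x * y + y ** 2

y0 = 1.0
for n in [10, 100, 1_000, 10_000, 100_000]:
    X = rng.standard_normal(n)                      # i.i.d. copies of X
    Y_n = y0 + rng.standard_normal() / np.sqrt(n)   # Y_n -> y_0 in probability
    avg = np.mean(f(X, Y_n))                        # (1/n) * sum_i f(X_i, Y_n)
    print(n, avg)                                   # should approach E[f(X, y_0)] = 1
```

Of course, a simulation for a single well-behaved $f$ can only illustrate the statement, not settle it either way.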


The result does hold provided we impose additional assumptions on $f$. For instance, assume either



(1) that $f(x, \cdot)$ is $L(x)$-Lipschitz for each $x$, with $\mathbb{E}[L(X)] < \infty$ (a sketch of why this suffices is given below);



or



(2) that $f(\cdot, y) \to f(\cdot, y_0)$ in the supremum norm whenever $y \to y_0$.



Second question: Is there a relationship between these two conditions? That is, is one strictly weaker than the other? For instance, if (1) holds with $L(x)$ bounded, then it is clear that (2) holds. But what about the general case?


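For reference, here is a rough sketch of why condition (1) is sufficient, under the additional (implicit) assumption that $\mathbb{E}|f(X, y_0)| < \infty$. By the Lipschitz bound,
$$\left| \frac{1}{n} \sum_{i=1}^{n} f(X_i, Y_n) - \frac{1}{n} \sum_{i=1}^{n} f(X_i, y_0) \right| \le |Y_n - y_0| \cdot \frac{1}{n} \sum_{i=1}^{n} L(X_i).$$
The ordinary law of large numbers gives $\frac{1}{n} \sum_{i=1}^{n} L(X_i) \to \mathbb{E}[L(X)] < \infty$ in probability, and $|Y_n - y_0| \to 0$ in probability, so the right-hand side tends to $0$ in probability. Applying the law of large numbers once more to $\frac{1}{n} \sum_{i=1}^{n} f(X_i, y_0)$ then yields the claimed limit $\mathbb{E}[f(X, y_0)]$.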








probability-theory probability-limit-theorems






asked Sep 3 at 9:57 by 12qu











  • I would search for an example where $Ef(X,y)$ is not continuous in $y$ at $y_0$, or for examples of non-uniform integrability.
    – kimchi lover
    Sep 3 at 14:53


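Following this suggestion, one quick way to probe a candidate $f$ numerically is to estimate $E f(X, y)$ by Monte Carlo on a grid of $y$ values near $y_0$ and look for a jump. The choice of $f$, the law of $X$, and $y_0$ below are assumptions for illustration only (this particular $f$ is well behaved, so no jump appears).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumptions: X ~ N(0, 1) and f(x, y) = cos(x * y),
# for which E f(X, y) = exp(-y**2 / 2) is continuous in y.
def f(x, y):
    return np.cos(x * y)

y0 = 0.0
X = rng.standard_normal(200_000)        # one large sample of X, reused for every y
for y in y0 + np.array([-0.1, -0.01, 0.0, 0.01, 0.1]):
    estimate = f(X, y).mean()           # Monte Carlo estimate of E f(X, y)
    print(round(float(y), 2), round(float(estimate), 4), round(float(np.exp(-y ** 2 / 2)), 4))
```

A discontinuity of $y \mapsto E f(X, y)$ at $y_0$ would show up as estimates that do not settle toward the value at $y_0$ as the grid is refined.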