When is a symmetric matrix invertible?
My professor always writes on the board:

$A$ is $m \times n$; assuming that the vectors of $A$ form a basis, then $A^TA$ is always invertible.

One thing I know is that $A^TA$ is always symmetric, but I'm not sure what conditions a symmetric matrix must satisfy to ensure that it is invertible.

linear-algebra matrices inverse symmetric-matrices
What do you mean by "the vectors for $A$"? – Robert Lewis, Jul 9 '17 at 17:38

Sorry, it should be "vectors of $A$" – it's my English. – nundo, Jul 9 '17 at 17:49

There is a long list of conditions that are equivalent to the condition that a matrix is invertible, all of which are equally valid for symmetric matrices. Are you instead asking for conditions on a matrix $A$ to ensure that $A^TA$ is invertible? – JMoravitz, Jul 9 '17 at 17:55

The words you need are "row" and "column." With $m$ rows and $n$ columns, $A^TA$ is a square matrix of size $n$. When $m \geq n$ and the $n$ columns of $A$ are independent, then $A^TA$ also has rank $n$, and is therefore invertible. – Will Jagy, Jul 9 '17 at 17:56

@JMoravitz yes, sorry, perhaps I mis-asked the question; thank you. – nundo, Jul 10 '17 at 5:27
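As a quick numerical sanity check of Will Jagy's observation (this example is illustrative, not from the original thread): with $m \geq n$ and independent columns, $A^TA$ has full rank; with dependent columns it is singular.

```python
import numpy as np

# Independent columns: A is 3x2, so A^T A is 2x2 and should be invertible.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
G = A.T @ A                      # Gram matrix [[2, 1], [1, 2]]
print(np.linalg.matrix_rank(G))  # 2 -> full rank, invertible

# Dependent columns (second column = 2 * first): A^T A is singular.
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
H = B.T @ B
print(np.linalg.matrix_rank(H))  # 1 -> rank-deficient, not invertible
```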
asked Jul 9 '17 at 17:36 – nundo; edited Jul 11 '17 at 9:50 – Widawensen
2 Answers
@RobertLewis A Gram matrix is usually defined by taking a set of vectors and letting the $(i,j)$ entry be the dot product of the $i$-th and $j$-th vectors. In doing so, the set of vectors can clearly be thought of as the column vectors of $A$, so saying "the vectors of $A$" is a completely natural thing to say, and should be unambiguous.

Here is an elegant proof: Gram matrix invertible iff set of vectors linearly independent
You seem to be saying that the ONLY condition needed on $A$ for $A^TA$ to be invertible is that the column vectors of $A$ are linearly independent. Please look at the example I gave to YvesDaoust: if $A$ is composed of the two vectors $[0,1,0,0]$ and $[1,0,0,1]$, then $A^TA$ is symmetric but not invertible. So I think your claim is not entirely true. – nundo, Jul 17 '17 at 0:41

In your "counterexample", $A^TA = \operatorname{diag}(1, 2)$ is most certainly invertible. – Randy, Dec 19 '17 at 21:04

@Randy I think nundo probably meant to put a transpose on those two vectors, so that $A$ is a $4\times 2$ matrix, not $2\times 4$. – Confounded, May 10 at 17:49
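The disagreement in these comments can be settled numerically (this check is an added illustration, not part of the thread): stacking the two vectors as columns gives an invertible $2\times 2$ Gram matrix, as Randy notes, while stacking them as rows gives a singular $4\times 4$ matrix, which is presumably what nundo intended.

```python
import numpy as np

v1 = np.array([0.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, 0.0, 0.0, 1.0])

# Vectors as columns: A is 4x2, A^T A = diag(1, 2), invertible.
A = np.column_stack([v1, v2])
print(A.T @ A)                          # [[1. 0.], [0. 2.]]

# Vectors as rows: B is 2x4, B^T B is 4x4 but only rank 2, hence singular.
B = np.vstack([v1, v2])
print(np.linalg.matrix_rank(B.T @ B))   # 2 < 4 -> not invertible
```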
A sufficient condition for a symmetric $n\times n$ matrix $C$ to be invertible is that the matrix is positive definite, i.e.

$$\forall x\in\mathbb{R}^n\setminus\{0\},\quad x^TCx>0.$$

We can use this observation to prove that $A^TA$ is invertible: from the fact that the $n$ columns of $A$ are linearly independent, we can prove that $A^TA$ is not only symmetric but also positive definite.

Indeed, using the Gram–Schmidt orthonormalization process, we can build an $n\times n$ invertible matrix $Q$ such that the columns of $AQ$ form a family of $n$ orthonormal vectors, and then

$$I_n=(AQ)^T (AQ),$$

where $I_n$ is the identity matrix of dimension $n$.

Let $x\in\mathbb{R}^n\setminus\{0\}$. Then, from $Q^{-1}x\neq 0$ it follows that $\|Q^{-1}x\|^2>0$, and so:

$$x^T(A^TA)x=x^T(AI_n)^T(AI_n)x=x^T(AQQ^{-1})^T(AQQ^{-1})x \\ = x^T(Q^{-1})^T(AQ)^T(AQ)(Q^{-1}x) = (Q^{-1}x)^T\left((AQ)^T(AQ)\right)(Q^{-1}x) \\ = (Q^{-1}x)^TI_n(Q^{-1}x) = (Q^{-1}x)^T(Q^{-1}x) = \|Q^{-1}x\|^2>0.$$

Since $x$ was arbitrary, it follows that

$$\forall x\in\mathbb{R}^n\setminus\{0\},\quad x^T(A^TA)x>0,$$

i.e. $A^TA$ is positive definite, and therefore invertible.
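The positive-definiteness claim is easy to spot-check numerically (an added sketch, not part of the original answer): a generic tall matrix has full column rank, so all eigenvalues of $A^TA$ should be strictly positive, and the quadratic form should equal $\|Ax\|^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # generic 5x3 matrix: full column rank almost surely
G = A.T @ A                       # symmetric 3x3

# Positive definite <=> every eigenvalue of the symmetric matrix G is > 0.
eigvals = np.linalg.eigvalsh(G)
print(eigvals.min() > 0)          # True

# The quadratic form x^T G x equals ||Ax||^2 for any x.
x = rng.standard_normal(3)
print(np.isclose(x @ G @ x, np.linalg.norm(A @ x) ** 2))  # True
```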
Why do you need to use Gram–Schmidt? You can argue directly that $x^TA^TAx = (Ax)^T(Ax) = \|Ax\|^2$, and the RHS is strictly positive for all nonzero $x$ provided that $A$ has trivial null space (or equivalently, that $A$ has full column rank). – Bungo, Aug 27 at 16:04
answered Jul 11 '17 at 15:34, edited Jul 11 '17 at 15:44 – Randy
answered Jul 28 at 5:46 – Bob