Time evolution of a Gaussian wave packet, why convert to k-space?
I'm trying to do a homework problem where I'm time evolving a Gaussian wave packet under the Hamiltonian $\frac{p^{2}}{2m}$.

So if I have a Gaussian wave packet given by
$$\Psi(x) = Ae^{-\alpha x^{2}}\,,$$
and I want to time evolve it, my first instinct would be to just tack on the time-evolution factor $e^{-\frac{iEt}{\hbar}}$.

However, the solution says this is incorrect: because the Hamiltonian is $p^2/2m$, I first need to convert the wave function to $k$-space with a Fourier transform. Can anyone tell me why I need to convert it to $k$-space first? In a finite-well example with the same Hamiltonian we can just multiply each term of the wave function by its time-evolution factor. Why can't we do that for a Gaussian wave packet?

quantum-mechanics homework-and-exercises
Ask yourself this: why do you think you can tack on the time dependence? What reason do you have to think that's correct? – DanielSank, 3 hours ago
And, more importantly, what value of the energy would you choose? Is your state an eigenstate of the Hamiltonian, with a well-defined energy? – Emilio Pisanty, 3 hours ago
asked 4 hours ago by M-B (new contributor); edited 3 hours ago by DanielSank
1 Answer
Tacking on a factor $e^{-iEt/\hbar}$ is the correct interpretation of the Schrödinger equation $$i\hbar\,\partial_t |\Psi\rangle = \hat H |\Psi\rangle$$ only for those eigenstates for which $$\hat H |\Psi\rangle = E|\Psi\rangle,$$ as otherwise you do not know what value of $E$ to substitute. Hypothetically you can still do it, but you pay the very painful cost that $E$ is then a full-fledged operator, and you therefore need to exponentiate an operator, which is nontrivial.
If this is all sounding a bit complicated, please remember that QM is just linear algebra in funny hats, so you can get an intuition for similar systems by just using some matrices and vectors, for example looking at $$i\hbar \begin{bmatrix} f'(t) \\ g'(t) \end{bmatrix} = \epsilon \begin{bmatrix} 0&1\\1&0\end{bmatrix} \begin{bmatrix} f(t) \\ g(t)\end{bmatrix}.$$ One can in fact express this as $$\begin{bmatrix}f(t)\\g(t)\end{bmatrix} = e^{-i\hat H t/\hbar} \begin{bmatrix} f_0\\ g_0\end{bmatrix},$$ but one has to exponentiate this matrix. That is not hard, because it squares to the identity matrix, giving a simple expansion: $$\begin{bmatrix}f(t)\\g(t)\end{bmatrix} = \cos(\epsilon t/\hbar) \begin{bmatrix} f_0\\ g_0\end{bmatrix} - i \sin(\epsilon t/\hbar) \begin{bmatrix} g_0\\ f_0\end{bmatrix}.$$ One can then confirm that this indeed satisfies the Schrödinger equation given above. One can also see immediately that it does not directly have the form $e^{-i\epsilon t/\hbar} [f_0; g_0]$, but how could it? That would be a different Hamiltonian, $\hat H = \epsilon I$.
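As a quick sanity check (not part of the original argument), the cosine/sine formula for this two-level system can be verified numerically against a reference matrix exponential built from an eigendecomposition; the values of $\epsilon$, $t$, $f_0$, $g_0$ below are arbitrary:

```python
import numpy as np

hbar, eps, t = 1.0, 0.7, 1.3
H = eps * np.array([[0., 1.], [1., 0.]])
v0 = np.array([0.3 + 0j, -0.8 + 0j])  # arbitrary initial [f0, g0]

# Closed form from above: cos(eps t/hbar)[f0, g0] - i sin(eps t/hbar)[g0, f0]
phase = eps * t / hbar
closed = np.cos(phase) * v0 - 1j * np.sin(phase) * v0[::-1]

# Reference: exponentiate H via its eigendecomposition
w, U = np.linalg.eigh(H)
expH = U @ np.diag(np.exp(-1j * w * t / hbar)) @ U.conj().T
ref = expH @ v0

assert np.allclose(closed, ref)
```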
But, with some creativity, one can see that those two remaining vectors would be parallel if $f_0 = g_0$ or if $f_0 = -g_0$, and one can indeed rewrite this solution in terms of those eigenvectors of the original $\hat H$ as $$\begin{bmatrix}f(t)\\g(t)\end{bmatrix} = e^{-i\epsilon t/\hbar}\, \alpha \begin{bmatrix} 1\\ 1\end{bmatrix} + e^{i\epsilon t/\hbar}\, \beta \begin{bmatrix} -1\\ 1\end{bmatrix}.$$ So the trick to finding general solutions more easily is to find these eigenvectors first and then form a general linear combination of them, each multiplied by its own time dependence. Then, for a given initial state, we need to find the coefficients $\alpha$ and $\beta$: in this case it is simple enough to look at $t=0$, where $\alpha - \beta = f_0$ while $\alpha + \beta = g_0$.
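Again purely as a check: solving the $t=0$ conditions gives $\alpha=(f_0+g_0)/2$ and $\beta=(g_0-f_0)/2$, and with those coefficients the eigenvector solution reproduces the closed form above (all numbers here are arbitrary):

```python
import numpy as np

hbar, eps, t = 1.0, 0.7, 1.3
f0, g0 = 0.3, -0.8
alpha = (f0 + g0) / 2   # from alpha - beta = f0, alpha + beta = g0
beta = (g0 - f0) / 2

# Eigenvector solution: phase-evolved combination of [1,1] and [-1,1]
eig = (np.exp(-1j * eps * t / hbar) * alpha * np.array([1., 1.])
       + np.exp(+1j * eps * t / hbar) * beta * np.array([-1., 1.]))

# Closed form from the matrix-exponential expansion
phase = eps * t / hbar
closed = np.cos(phase) * np.array([f0, g0]) - 1j * np.sin(phase) * np.array([g0, f0])

assert np.allclose(eig, closed)
```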
Similarly, for your Hamiltonian $\hat H = \hat p^2/(2m) = -\frac{\hbar^2}{2m}\frac{\partial^2}{\partial x^2},$ you know that the eigenvectors are plane waves, $$\phi_k(x) = e^{ikx}.$$ You know that you can then add time dependence to them in the obvious way, $$\Phi_k(x, t) = e^{i(k x - \omega_k t)},$$ where of course $$\hbar \omega_k = \frac{\hbar^2k^2}{2m}.$$ So the eigenvector story is just beautifully simple: all you need is the ability to take derivatives of exponentials.
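Spelled out, that eigenvalue check really is just derivatives of exponentials:

```latex
i\hbar\,\frac{\partial}{\partial t}\, e^{i(kx-\omega_k t)}
  = \hbar\omega_k\, e^{i(kx-\omega_k t)},
\qquad
-\frac{\hbar^2}{2m}\frac{\partial^2}{\partial x^2}\, e^{i(kx-\omega_k t)}
  = \frac{\hbar^2 k^2}{2m}\, e^{i(kx-\omega_k t)},
```

so the two sides of the Schrödinger equation agree precisely when $\hbar\omega_k = \hbar^2 k^2/(2m)$.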
The part of the story that is more complicated is assembling an arbitrary $\psi(x)$ as a sum of these exponentials. But while it is complicated, it is not impossible: you know from Fourier's theorem that $$\psi(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} dk\; e^{i k x} \int_{-\infty}^\infty d\xi\; e^{-ik\xi}\, \psi(\xi).$$ Let your eyes glaze over the second integral and see it for just what it is: some coefficient $\psi[k]$ in $k$-space. What we have here is then a sum (a continuous sum, but still a sum!) of coefficients times eigenfunctions: $$\psi(x) = \int_{-\infty}^{\infty}\frac{dk\;\psi[k]}{2\pi}\,\phi_k(x).$$
And we know how to Schrödinger-ize such a sum: we just attach an $e^{-i\omega_k t}$ factor to each eigenfunction, turning $\phi_k$ into $\Phi_k$. So we get
$$\Psi(x, t) = \frac{1}{2\pi}\int_{-\infty}^{\infty} dk\; e^{i (k x - \omega_k t)}\, \psi[k].$$
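This recipe (transform to $k$-space, multiply by $e^{-i\omega_k t}$, transform back) is also exactly what one codes up numerically. Here is a minimal NumPy FFT sketch, assuming for illustration $\hbar = m = 1$ and $\alpha = 1$; the spreading of the packet is checked against the standard free-particle result $\langle x^2\rangle(t) = \frac{1}{4\alpha} + \frac{\hbar^2\alpha\, t^2}{m^2}$:

```python
import numpy as np

hbar = m = 1.0
alpha, t = 1.0, 1.0        # illustrative width parameter and evolution time
N, L = 2048, 40.0          # grid: N points spanning [-L/2, L/2)
dx = L / N
x = (np.arange(N) - N // 2) * dx

# Initial Gaussian packet, normalized (this fixes the constant A)
psi0 = np.exp(-alpha * x**2)
psi0 = psi0 / np.sqrt(np.sum(np.abs(psi0)**2) * dx)

# Evolve: FFT to k-space, multiply by exp(-i w_k t), FFT back
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
omega = hbar * k**2 / (2 * m)            # free-particle dispersion
psi_t = np.fft.ifft(np.fft.fft(psi0) * np.exp(-1j * omega * t))

prob = np.abs(psi_t)**2
norm = np.sum(prob) * dx                 # should stay 1 (unitary evolution)
var = np.sum(x**2 * prob) * dx           # <x^2>(t); <x> stays 0 here

assert np.isclose(norm, 1.0)
assert np.isclose(var, 1/(4*alpha) + hbar**2 * alpha * t**2 / m**2, rtol=1e-3)
```

The phase multiplication in $k$-space is manifestly unitary, which is why the norm check passes exactly up to floating-point error.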
You do not have to do it this way; you can instead try some sort of $$\exp\left[\frac{i\hbar t}{2m}\, \frac{\partial^2}{\partial x^2}\right] e^{-\alpha x^2}$$ monstrosity, expanding the operator in a power series and then seeing whether there are patterns among the $n^\text{th}$ derivatives of Gaussians you can use to simplify. But the operator-expansion way looks really pretty difficult, while the eigenvector way is really easy.
The reason it is really easy is that both $\hat H$ and $i\hbar\,\partial_t$ are linear operators: they distribute over sums. So if you are still feeling queasy about this procedure, convince yourself by just writing it out: calculate $$0 = \left(i\hbar\, \frac{\partial}{\partial t} + \frac{\hbar^2}{2m}\,\frac{\partial^2}{\partial x^2}\right) \frac{1}{2\pi} \int_{-\infty}^\infty dk\;\psi[k]\; e^{i (k x - \omega_k t)}.$$ Notice that this holds with essentially no restriction on the form of $\psi[k]$, so you only need to choose coefficients $\psi[k]$ such that $\Psi(x, 0) = \psi(x)$.
answered 2 hours ago by CR Drost; edited 41 mins ago