Hi,
we are trying to sign up under the PSF umbrella for this year's Google Summer of Code, prompted by an e-mail from Ankit Mahato, who expressed interest in helping develop SfePy as his GSoC project this summer.
So let us discuss possible project ideas here. I will post results of the discussion to [1].
Ankit's ideas are (my summary):
#1 parallelization - cluster support using mpi4py
#2 pre- and post-processing GUI frontend
#3 incorporating phase changing materials (his research area)
Ankit, could you post the full text of your ideas in this thread? The pdf you sent me does not allow selecting text.
My comments:
For me, #1 is something I was planning to do "soon" anyway, as I am going to need it for my research work - help would come in really handy, but we will have to think carefully about the implementation. I think I prefer having a parallel layer above the current serial FEM, so that the current code can stay as it is, unaware that it runs in parallel. I am not sure yet how difficult this is going to be, but it won't be trivial.
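A minimal sketch of how such a layer might split the work, with the serial core untouched (the helper name is hypothetical, not existing SfePy API): each process owns a contiguous slice of cells and runs the existing serial assembly on its slice only.

```python
def partition_cells(n_cells, n_procs):
    """Split cell indices 0..n_cells-1 into n_procs contiguous,
    nearly equal slices - a hypothetical helper for a parallel
    layer that leaves the serial FEM core unaware of the split."""
    base, extra = divmod(n_cells, n_procs)
    parts = []
    start = 0
    for rank in range(n_procs):
        # the first `extra` ranks get one cell more
        size = base + (1 if rank < extra else 0)
        parts.append(range(start, start + size))
        start += size
    return parts

# Example: 10 cells distributed over 3 processes.
parts = partition_cells(10, 3)
```

The real layer would also need ghost cells along the slice boundaries, which is where the non-trivial part begins.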
#2 would be nice, but IMHO it is not as important as having a solid and reasonably fast FEM core.
#3 would IMHO be the most useful for Ankit, and a nice addition to the modelling capabilities of SfePy.
Other possible topics can be found in our issues list ("enhancement" label).
IMHO it would be good for prospective student(s) to try tackling some of the issues listed below, to get acquainted with the SfePy code before GSoC starts:
#196 Document properly term evaluation modes and postprocessing/probing.
#195 describe how to add Neumann BC in a diffusion example and tutorial (tutorial part done by Alec)
#167 improve gallery page
#164 Python 3 compatibility
#154 automatic testing of terms
#140 test schroedinger.py
#133 Provide examples for SfePy Terms
Implementing the other enhancements would, of course, also be very useful, but those are IMHO too difficult for someone still learning the code. They are certainly quite difficult for me, as they are not done yet =:) (shell elements!)
Cheers, r.

[1] http://sfepy.org/doc-devel/development.html
Hi everyone,
Here is the full text version of the Ideas as Robert requested:
#1 Parallelization
I went through the mailing list, where it has been mentioned that SfePy can use multiple cores via numpy/scipy, but compute cluster support, which requires knowledge of MPI, is not available. So we can add compute cluster support, where jobs communicate with each other and exploit high performance computing to make SfePy scalable. In Python this can be done using the mpi4py module.
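A rough sketch of the mpi4py pattern this involves (an assumption on my part - the real SfePy integration would be far more involved): each rank computes a local contribution and a reduction combines them. Without mpi4py installed, the code below degrades to a serial run.

```python
# Reduction pattern sketch; run in parallel with e.g. `mpiexec -n 4 python this.py`.
def local_contribution(rank, size, values):
    """Sum the slice of `values` owned by `rank` (block distribution)."""
    n = len(values)
    start = rank * n // size
    stop = (rank + 1) * n // size
    return sum(values[start:stop])

try:
    from mpi4py import MPI
    comm = MPI.COMM_WORLD
    # each rank sums only its own block ...
    part = local_contribution(comm.rank, comm.size, list(range(100)))
    # ... and an allreduce combines the partial sums on every rank
    total = comm.allreduce(part, op=MPI.SUM)
except ImportError:
    # serial fallback: one "rank" owns everything
    total = local_contribution(0, 1, list(range(100)))
```

In a FEM setting the "values" would be element contributions to a residual or matrix, which is exactly where the partitioning layer above the serial core would plug in.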
#2 Pre-processing and Post-Processing combined with SfePy
One of the reasons people use proprietary software is its ease of use. We can build a complete simulation platform with a powerful frontend spanning pre-processing, analysis, and post-processing. This will include scripting support, so that people can write scripts to post-process the data on the platform itself. It will also be made modular, so as to be extensible. We can look into integrating it with CAD packages like PythonCAD, or build our own pre-processor and post-processor using a powerful GUI toolkit, to provide a complete simulation solution.
#3 Incorporating coupled equations for phase changing materials
Incorporating phase changing material simulation, which has never been done in any simulation software package before. Since it is my research area, I will develop the model based on various research papers and my own research, and incorporate it into SfePy.

Regards, Ankit Mahato
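For reference, one standard way to write coupled equations for a phase changing material - a sketch only, not necessarily the formulation Ankit has in mind - is the enthalpy method, where the latent heat $L$ enters through a temperature-dependent liquid fraction $f_l$:

```latex
% Energy balance with convection; H is the volumetric enthalpy.
\frac{\partial H}{\partial t}
  + \nabla \cdot \left( \rho c_p \, \mathbf{u} \, T \right)
  = \nabla \cdot \left( k \nabla T \right),
\qquad
H = \rho c_p T + \rho L \, f_l(T)
```

with $f_l$ rising from 0 (solid) to 1 (liquid) across the melting range; the velocity $\mathbf{u}$ is what couples this to the flow equations.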
On Tuesday, 19 March 2013 16:32:22 UTC+5:30, Robert Cimrman wrote:
Thanks!
SfePy is now listed among the teams at [1]. The dates and deadlines are at the bottom of [2].
r.
[1] http://wiki.python.org/moin/SummerOfCode/2013
[2] https://www.google-melange.com/gsoc/events/google/gsoc2013
On 03/19/2013 12:17 PM, Ankit Mahato wrote:
Awesome. :) :)
I am currently working on something. Will be posting here soon :)
On Wednesday, March 20, 2013 2:11:04 PM UTC+5:30, Robert Cimrman wrote:
Of course, if there is anybody else willing to try getting paid by Google for work on SfePy, do not hesitate to let us know.
r.
On 03/20/2013 11:19 AM, Ankit Mahato wrote:
Idea #4:
Extending SfePy to encompass FEVA - Finite Element Vibration Analysis [1]
On Thursday, 21 March 2013 15:38:09 UTC+5:30, Robert Cimrman wrote:
On 04/03/2013 02:47 PM, Ankit Mahato wrote:
Idea #4:
Extending SfePy to encompass FEVA - Finite Element Vibration Analysis [1]
You mean adding modal analysis capabilities for solids? The book has quite a broad range...
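In FEM terms, modal analysis of a solid reduces to the generalized eigenvalue problem for the assembled stiffness matrix $K$ and mass matrix $M$:

```latex
K \, \boldsymbol{\phi}_i = \omega_i^2 \, M \, \boldsymbol{\phi}_i
```

where $\omega_i$ are the natural angular frequencies and $\boldsymbol{\phi}_i$ the mode shapes; a sparse symmetric eigensolver applied to the matrices SfePy already assembles would be the natural starting point.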
I have set up a wiki page [w] for this list of ideas, as looking them up in this thread might become difficult. I have also added you to our contributors team at github, hoping it's enough to get you wiki edit rights. Let me know if it is not - we have not used the wiki for quite some time...
Cheers, r.
[w] https://github.com/sfepy/sfepy/wiki
Yes Robert, I got hold of the book yesterday and it does cover a wide range. I will edit the wiki with the ideas.
Regards, Ankit
On Wednesday, 3 April 2013 18:38:27 UTC+5:30, Robert Cimrman wrote:
Hi R,
This is the path I have come up with for my GSoC idea implementation:

Convection-Diffusion (Steady)
Convection-Diffusion (Unsteady)
Convection-Diffusion coupled with phase change
Meso-Scale Dendritic growth (with convection, diffusion, phase change)
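As a toy illustration of the very first step of that path - pure Python, upwind convection and central diffusion, nothing SfePy-specific, and certainly not the FEM discretization SfePy would use - one explicit update of 1D convection-diffusion might look like:

```python
def step_convection_diffusion(u, dx, dt, vel, diff):
    """One explicit time step of 1D convection-diffusion,
    u_t + vel * u_x = diff * u_xx, with fixed (Dirichlet) ends.
    Upwind convection (vel >= 0 assumed), central diffusion."""
    new = u[:]  # keep the boundary values fixed
    for i in range(1, len(u) - 1):
        conv = -vel * (u[i] - u[i - 1]) / dx
        diffusion = diff * (u[i + 1] - 2.0 * u[i] + u[i - 1]) / dx**2
        new[i] = u[i] + dt * (conv + diffusion)
    return new

# Smoke run: a step profile starts advecting to the right.
u = [1.0] * 5 + [0.0] * 5
u1 = step_convection_diffusion(u, dx=1.0, dt=0.1, vel=1.0, diff=0.0)
```

The steady case, the unsteady case, and the phase change coupling then map onto the later items of the list, each replacing a piece of this toy with proper SfePy terms.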
Kindly lend your view.
Regards, Ankit
On Wednesday, 3 April 2013 18:58:02 UTC+5:30, Ankit Mahato wrote:
Yes Robert, I got hold of the book yesterday and it covers a wide range. I will edit the wiki with the ideas.
Regards, Ankit
On Wednesday, 3 April 2013 18:38:27 UTC+5:30, Robert Cimrman wrote:
On 04/03/2013 02:47 PM, Ankit Mahato wrote:
Idea #4:
Extending SfePy to encompass FEVA - Finite Element Vibration Analysis [1]
[1]
You mean adding modal analysis capabilities for solids? The book has quite a broad range...
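Modal analysis for solids boils down to the generalized eigenvalue problem K φ = ω² M φ. A minimal sketch of how that could be prototyped with scipy (the matrices here are a toy 1D stand-in, not SfePy output, and the function name is hypothetical):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def lowest_modes(K, M, n_modes=3):
    """Solve K phi = omega^2 M phi for the lowest natural frequencies."""
    # Shift-invert around sigma=0 targets the smallest eigenvalues.
    omega2, phi = eigsh(K, k=n_modes, M=M, sigma=0)
    return np.sqrt(omega2), phi

# Toy stiffness/mass matrices of a fixed-fixed 1D bar (finite-difference-like).
n = 50
K = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csc')
M = sp.identity(n, format='csc') / n
freqs, shapes = lowest_modes(K, M)
```

The same pattern would apply to real FE stiffness and mass matrices once they are assembled; the hard part is the assembling, not the eigensolve.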
I have set up a wiki page [w] for this list of ideas, as looking them up in this thread might become difficult. I have also added you to our contributors team at github, hoping it's enough to get you wiki edit rights. Let me know if it is not so - we have not used the wiki for quite some time...
Cheers, r.
[w] https://github.com/sfepy/sfepy/wiki
On Thursday, 21 March 2013 15:38:09 UTC+5:30, Robert Cimrman wrote:
Of course, if there is anybody else willing to try getting paid by Google for work on SfePy, do not hesitate and let us know.
r.
Awesome. :) :)
I am currently working on something. Will be posting here soon :)
On Wednesday, March 20, 2013 2:11:04 PM UTC+5:30, Robert Cimrman wrote:
Thanks!
SfePy is now listed among teams at [1]. The dates and deadlines are at the bottom of [2].
r.
[1] http://wiki.python.org/moin/SummerOfCode/2013
[2] https://www.google-melange.com/gsoc/events/google/gsoc2013
On 03/19/2013 12:17 PM, Ankit Mahato wrote:

Hi everyone,

Here is the full text version of the ideas as Robert requested:

#1 Parallelization

I went through the mailing list, where it is mentioned that SfePy supports multicore computation via numpy/scipy, but compute cluster support, which requires knowledge of MPI, is not available. We can add compute cluster support, where jobs communicate with each other and exploit high performance computing in order to make SfePy scalable. In Python this can be done using the mpi4py module.

#2 Pre-processing and post-processing combined with SfePy

One of the reasons why people use proprietary software is its ease of use. We can build a complete simulation platform with a powerful frontend covering pre-processing, analysis and post-processing. This will include script support, so that people can write scripts to post-process the data on the platform itself. It will also be made modular, so as to be extensible. We can look into integrating it with CAD packages like PythonCAD, or build our own pre-processor and post-processor using a powerful GUI toolkit to provide a complete simulation solution.

#3 Incorporating coupled equations for phase changing materials

Incorporating phase changing material simulation, which has never been done in any simulation software package before. Since it is my research area, I will develop a model based on various research papers and my own research and incorporate it into SfePy.

Regards,
Ankit Mahato
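Before any MPI code, the core of idea #1 is just bookkeeping: deciding which elements each rank owns. A toy 1D sketch (a real implementation would partition the mesh connectivity graph, e.g. with Metis, and exchange ghost values via mpi4py; the function name is hypothetical):

```python
import numpy as np

def partition_elements(n_el, n_ranks):
    """Split element indices into contiguous per-rank chunks (toy 1D case)."""
    return np.array_split(np.arange(n_el), n_ranks)

# Every element lands on exactly one rank, with near-equal load.
chunks = partition_elements(10, 3)
```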
Hi Ankit,
On 04/19/2013 11:37 PM, Ankit Mahato wrote:
Hi R,
This is the path I have come up with for my GSoC idea implementation:
- Convection-diffusion (steady)
- Convection-diffusion (unsteady)
- Convection-diffusion coupled with phase change
- Meso-scale dendritic growth (with convection, diffusion, phase change)
Kindly lend your view.
Looks OK (saying that with zero knowledge of the latter topics :)). The zeroth step would be a weak formulation of all the equations - what kinds of FE/function spaces are you going to need? Will there be issues with the inf-sup (Babuska-Brezzi) condition, like in incompressible flow, or other numerical problems to be tackled?
r.
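For reference, the weak form of the steady convection-diffusion stage would read something like the following (a generic sketch: κ is the diffusivity, b the convection velocity, f a source, with Dirichlet data on Γ_D and flux data g on Γ_N). Find u with u = ū on Γ_D such that, for all test functions v vanishing on Γ_D,

```latex
\int_\Omega \kappa \, \nabla u \cdot \nabla v \,\mathrm{d}x
+ \int_\Omega (\mathbf{b} \cdot \nabla u)\, v \,\mathrm{d}x
= \int_\Omega f\, v \,\mathrm{d}x
+ \int_{\Gamma_N} g\, v \,\mathrm{d}s .
```

For the pure scalar problem, standard H¹ elements suffice and no inf-sup condition arises; it shows up only once a mixed velocity-pressure formulation (e.g. for melt flow) enters, and convection-dominated regimes may instead need stabilization such as SUPG.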
Is anybody pursuing parallelization at this point? Any hints about how that might be done? I'd be interested in contributing. I've got some experience with pypar (openmpi python wrapper), but not sure how sfepy would need to be modified to make that work.
-steve
On 06/06/2013 01:09 AM, steve wrote:
Is anybody pursuing parallelization at this point? Any hints about how that might be done? I'd be interested in contributing. I've got some experience with pypar (openmpi python wrapper), but not sure how sfepy would need to be modified to make that work.
We agreed with Ankit (the GSoC student) that both devising and implementing the parallelization would be too much for the three or so months GSoC supports, and too risky, so I think nobody is pursuing that right now.
I have only some vague ideas, like that it should be done at a high level, so that the FE assembling code does not see it. So any help, mostly theoretical in the current state, would be appreciated!
r.
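The "high level layer the assembling code does not see" idea can be illustrated in miniature: each process runs the unchanged serial assembling on its own subdomain, and the pieces are summed afterwards (here sequentially, standing in for an MPI reduction; all names are hypothetical, not SfePy API):

```python
import numpy as np
import scipy.sparse as sp

def assemble_subdomain(elements, n_dof):
    """Stand-in for the existing serial assembling, run on one subdomain."""
    mtx = sp.lil_matrix((n_dof, n_dof))
    for e in elements:
        # Toy 1D "stiffness" contribution of element e on dofs (e, e+1).
        mtx[e, e] += 1.0
        mtx[e + 1, e + 1] += 1.0
        mtx[e, e + 1] -= 1.0
        mtx[e + 1, e] -= 1.0
    return mtx.tocsr()

n_dof, elements = 6, np.arange(5)
pieces = [assemble_subdomain(els, n_dof)
          for els in np.array_split(elements, 2)]  # two "processes"
K_parallel = sum(pieces)  # with MPI this would be a reduction over ranks
K_serial = assemble_subdomain(elements, n_dof)
# K_parallel equals K_serial: the serial code never noticed the split.
```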
On Jun 6, 2013, at 2:38 AM, Robert Cimrman cimr...@ntc.zcu.cz wrote:

> I have only some vague ideas, like that it should be done at a high level, so that the FE assembling code does not see it. So any help, mostly theoretical in the current state, would be appreciated!
So... from my brief perusal it seems like the easiest approach might be something like that taken in FiPy. It looks like they switch between PySparse, SciPy and Trilinos depending on the situation and whether they want to use MPI. I have no idea how hard that would be to actually implement. ;-) To what degree does the FE assembling code depend explicitly on scipy?
this is from the FiPy "solvers" page: http://www.ctcms.nist.gov/fipy/documentation/SOLVERS.html
FiPy requires either PySparse, SciPy or Trilinos to be installed in order to solve linear systems. From our experiences, FiPy runs most efficiently in serial when PySparse is the linear solver. Trilinos is the most complete of the three solvers due to its numerous preconditioning and solver capabilities and it also allows FiPy to run in parallel.
-steve
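The FiPy-style switching steve describes could start as a thin dispatch layer over interchangeable linear-solver backends; a sketch using only scipy backends (the backend names, and the idea of later adding petsc4py/Trilinos branches, are hypothetical):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve(A, b, backend='scipy-direct'):
    """Dispatch the linear solve to a named backend."""
    if backend == 'scipy-direct':
        return spla.spsolve(A.tocsc(), b)
    elif backend == 'scipy-cg':
        x, info = spla.cg(A, b)
        if info != 0:
            raise RuntimeError('CG did not converge')
        return x
    raise ValueError('unknown backend: %s' % backend)

# 1D Poisson toy system, solved by both backends.
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(20, 20), format='csr')
b = np.ones(20)
x_direct = solve(A, b, 'scipy-direct')
x_cg = solve(A, b, 'scipy-cg')
```

A parallel backend would slot in as just another branch, which is presumably why FiPy's design keeps the rest of the code backend-agnostic.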
On 06/06/2013 05:41 PM, Steve Spicklemire wrote:

> So... from my brief perusal it seems like the easiest approach might be something like that taken in FiPy. It looks like they switch between PySparse, SciPy and Trilinos depending on the situation and whether they want to use MPI. I have no idea how hard that would be to actually implement. ;-) To what degree does the FE assembling code depend explicitly on scipy?
Well, we can "sort of use" a parallel linear solver right now (see PETScParallelKrylovSolver in [1]), but that cannot be efficient unless the matrix is also assembled in parallel (all terms evaluated in their respective parallel subdomains). The "sort of use" means that the matrix is assembled in a single process, and only then a parallel solution is launched. It can get us something like a 2.5x speed-up on 4 cores, so it's not that bad, but it cannot get us further...
Does FiPy have "parallel assembling"? I know it's finite volumes, but it is essentially the same as finite elements in this respect - one has a grid/mesh that needs to be distributed among processors.
r.
[1] http://sfepy.org/doc-devel/src/sfepy/solvers/ls.html
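As a back-of-envelope check, Amdahl's law with a 2.5x speed-up on 4 cores implies roughly a 20% serial fraction, capping the achievable speed-up near 5x no matter how many cores are added - consistent with serial assembling being the bottleneck (illustrative arithmetic only, assuming the solve is the only parallel part):

```python
def amdahl_speedup(serial_frac, n_procs):
    """Amdahl's law: speed-up given a serial fraction and a process count."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n_procs)

# Solve 2.5 = 1 / (f + (1 - f)/4) for the implied serial fraction f.
f = (4.0 / 2.5 - 1.0) / 3.0   # ~0.2
limit = 1.0 / f               # asymptotic speed-up, ~5x
```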
Unfortunately I have said everything I know about FiPy (what's on the web page). I've never used it. ;-(
I'll try to dig more deeply and report back.
-steve
participants (4)
- Ankit Mahato
- Robert Cimrman
- steve
- Steve Spicklemire