Available Count (int64, 1 to 31) | AnswerCount (int64, 1 to 35) | GUI and Desktop Applications (int64, 0 to 1) | Users Score (int64, -17 to 588) | Q_Score (int64, 0 to 6.79k) | Python Basics and Environment (int64, 0 to 1) | Score (float64, -1 to 1.2) | Networking and APIs (int64, 0 to 1) | Question (string, lengths 15 to 7.24k) | Database and SQL (int64, 0 to 1) | Tags (string, lengths 6 to 76) | CreationDate (string, length 23) | System Administration and DevOps (int64, 0 to 1) | Q_Id (int64, 469 to 38.2M) | Answer (string, lengths 15 to 7k) | Data Science and Machine Learning (int64, 0 to 1) | ViewCount (int64, 13 to 1.88M) | is_accepted (bool, 2 classes) | Web Development (int64, 0 to 1) | Other (int64, 1 to 1) | Title (string, lengths 15 to 142) | A_Id (int64, 518 to 72.2M)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4 | 15 | 0 | 4 | 236 | 1 | 0.053283 | 0 | I've been wondering this for some time. As the title says, which is faster, the actual function or simply raising to the half power?
UPDATE
This is not a matter of premature optimization. This is simply a question of how the underlying code actually works. What is the theory of how Python code works?
I sent Guido van Rossum an email because I really wanted to know the differences in these methods.
My email:
There are at least 3 ways to do a square root in Python: math.sqrt, the
'**' operator and pow(x,.5). I'm just curious as to the differences in
the implementation of each of these. When it comes to efficiency which
is better?
His response:
pow and ** are equivalent; math.sqrt doesn't work for complex numbers,
and links to the C sqrt() function. As to which one is
faster, I have no idea... | 0 | python,performance | 2008-11-29T01:24:00.000 | 0 | 327,002 | using Claudiu's code, on my machine even with "from math import sqrt" x**.5 is faster but using psyco.full() sqrt(x) becomes much faster, at least by 200% | 0 | 100,568 | false | 0 | 1 | Which is faster in Python: x**.5 or math.sqrt(x)? | 327,063 |
4 | 15 | 0 | 3 | 236 | 1 | 0.039979 | 0 | I've been wondering this for some time. As the title says, which is faster, the actual function or simply raising to the half power?
UPDATE
This is not a matter of premature optimization. This is simply a question of how the underlying code actually works. What is the theory of how Python code works?
I sent Guido van Rossum an email because I really wanted to know the differences in these methods.
My email:
There are at least 3 ways to do a square root in Python: math.sqrt, the
'**' operator and pow(x,.5). I'm just curious as to the differences in
the implementation of each of these. When it comes to efficiency which
is better?
His response:
pow and ** are equivalent; math.sqrt doesn't work for complex numbers,
and links to the C sqrt() function. As to which one is
faster, I have no idea... | 0 | python,performance | 2008-11-29T01:24:00.000 | 0 | 327,002 | Most likely math.sqrt(x), because it's optimized for square rooting.
Benchmarks will provide you the answer you are looking for. | 0 | 100,568 | false | 0 | 1 | Which is faster in Python: x**.5 or math.sqrt(x)? | 327,005 |
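A quick way to get those numbers is the standard library's timeit module. The following is a generic editor-added sketch for illustration, not code from the original answers; absolute timings will vary with the interpreter version and machine.

```python
# Compare the three spellings of a square root with timeit (Python 3).
import timeit

setup = "from math import sqrt; x = 123.456"
for stmt in ("x ** 0.5", "sqrt(x)", "pow(x, 0.5)"):
    seconds = timeit.timeit(stmt, setup=setup, number=1000000)
    print("%-12s %.3f s per million calls" % (stmt, seconds))
```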
4 | 15 | 0 | -4 | 236 | 1 | -1 | 0 | I've been wondering this for some time. As the title says, which is faster, the actual function or simply raising to the half power?
UPDATE
This is not a matter of premature optimization. This is simply a question of how the underlying code actually works. What is the theory of how Python code works?
I sent Guido van Rossum an email because I really wanted to know the differences in these methods.
My email:
There are at least 3 ways to do a square root in Python: math.sqrt, the
'**' operator and pow(x,.5). I'm just curious as to the differences in
the implementation of each of these. When it comes to efficiency which
is better?
His response:
pow and ** are equivalent; math.sqrt doesn't work for complex numbers,
and links to the C sqrt() function. As to which one is
faster, I have no idea... | 0 | python,performance | 2008-11-29T01:24:00.000 | 0 | 327,002 | What would be even faster is if you went into math.py and copied the function "sqrt" into your program. It takes time for your program to find math.py, then open it, find the function you are looking for, and then bring that back to your program. If that function is faster even with the "lookup" steps, then the function itself has to be awfully fast. Probably will cut your time in half. IN summary:
Go to math.py
Find the function "sqrt"
Copy it
Paste function into your program as the sqrt finder.
Time it. | 0 | 100,568 | false | 0 | 1 | Which is faster in Python: x**.5 or math.sqrt(x)? | 29,231,648 |
9 | 13 | 0 | 3 | 10 | 0 | 0.046121 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | If Low memory and low startup time are truly important you might want to consider doing the work to keep the C code cross platform, however I have found this is rarely necessary.
Personally I would use Ruby or Python for this type of job, they both make it very easy to make clear understandable code that others can maintain (or you can maintain after not looking at it for 6 months). If you have the control to do so I would also suggest getting the latest version of the interpreter, as both Ruby and Python have made notable improvements around performance recently.
It is a bit of a personal thing. Programming Ruby makes me happy, C code does not (nor bash scripting for anything non-trivial). | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,062 |
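For a sense of how small the Python version of such a tool can be, here is a rough sketch of the "parse XML, then HTTP POST" task using only the standard library. The URL, file name and element name are placeholders invented for illustration; they are not from the original question.

```python
# Sketch: read one value out of a local XML file and POST it somewhere (Python 3).
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def report_status(url, xml_path):
    root = ET.parse(xml_path).getroot()
    value = root.findtext("status", default="")            # hypothetical element name
    data = urllib.parse.urlencode({"status": value}).encode()
    with urllib.request.urlopen(url, data=data) as resp:   # POST because data is supplied
        return resp.status

if __name__ == "__main__":
    print(report_status("http://example.invalid/api", "report.xml"))
```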
9 | 13 | 0 | 0 | 10 | 0 | 0 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | I agree with others in that you should probably try to make this a more portable C app instead of porting it over to something else since any scripting language is going to introduce significant overhead from a startup perspective, have a much larger memory footprint, and will probably be much slower.
In my experience, Python is the most efficient of the three, followed by Perl and then Ruby with the difference between Perl and Ruby being particularly large in certain areas. If you really want to try porting this to a scripting language, I would put together a prototype in the language you are most comfortable with and see if it comes close to your requirements. If you don't have a preference, start with Python as it is easy to learn and use and if it is too slow with Python, Perl and Ruby probably won't be able to do any better. | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,075 |
9 | 13 | 0 | 5 | 10 | 0 | 0.076772 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | When written properly, C should be platform independent and would only need a recompile for those different platforms. You might have to jump through some #ifdef hoops for the headers (not all systems use the same headers), but most normal (non-win32 API) calls are very portable.
For web access (which I presume you need as you mention bash+curl), you could take a look at libcurl, it's available for all the platforms you mentioned, and shouldn't be that hard to work with.
With execution time and memory cost in mind, I doubt you could go any faster than properly written C with any scripting language as you would lose at least some time on interpreting the script... | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,054 |
9 | 13 | 0 | 4 | 10 | 0 | 0.061461 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | I concur with Lua: it is super-portable, it has XML libraries, either native or by binding C libraries like Expat, it has a good socket library (LuaSocket) plus, for complex stuff, some cURL bindings, and is well known for being very lightweight (often embedded in low memory devices), very fast (one of the fastest scripting languages), and powerful. And very easy to code!
It is coded in pure ANSI C, and a lot of people claim it has one of the best C binding APIs (calling C routines from Lua, calling Lua code from C...). | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,120
9 | 13 | 0 | 9 | 10 | 0 | 1.2 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | Because of your requirement for fast startup time and a calling frequency greater than 1Hz I'd recommend either staying with C and figuring out how to make it portable (not always as easy as a few ifdefs) or exploring the possibility of turning it into a service daemon that is always running. Of course this depends on how
Python can have lower startup times if you compile the module and run the .pyc file, but it is still generally considered slow. Perl, in my experience, is the fastest of the scripting languages, so you might have good luck with a Perl daemon.
You could also look at cross platform frameworks like gtk, wxWidgets and Qt. While they are targeted at GUIs they do have low level cross platform data types and network libraries that could make the job of using a fast C based application easier. | 0 | 8,004 | true | 0 | 1 | Scripting language choice for initial performance | 328,065 |
9 | 13 | 0 | 23 | 10 | 0 | 1 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | Lua is a scripting language that meets your criteria. It's certainly the fastest and lowest memory scripting language available. | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,045 |
9 | 13 | 0 | 0 | 10 | 0 | 0 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | Port your app to Ruby. If your app is too slow, profile it and rewrite the those parts in C. | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,519 |
9 | 13 | 0 | 6 | 10 | 0 | 1 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | "called anywhere from every minute to many times per second. As a consequence, keeping it's memory and startup time low are important."
This doesn't sound like a script to me at all.
This sounds like a server handling requests that arrive from every minute to several times a second.
If it's a server, handling requests, start-up time doesn't mean as much as responsiveness. In which case, Python might work out well, and still keep performance up.
Rather than restarting, you're just processing another request. You get to keep as much state as you need to optimize performance. | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,129 |
9 | 13 | 0 | 0 | 10 | 0 | 0 | 0 | I have a small lightweight application that is used as part of a larger solution. Currently it is written in C but I am looking to rewrite it using a cross-platform scripting language. The solution needs to run on Windows, Linux, Solaris, AIX and HP-UX.
The existing C application works fine but I want to have a single script I can maintain for all platforms. At the same time, I do not want to lose a lot of performance but am willing to lose some.
Startup cost of the script is very important. This script can be called anywhere from every minute to many times per second. As a consequence, keeping its memory and startup time low is important.
So basically I'm looking for the best scripting language that is:
Cross platform.
Capable of XML parsing and HTTP Posts.
Low memory and low startup time.
Possible choices include but are not limited to: bash/ksh + curl, Perl, Python and Ruby. What would you recommend for this type of a scenario? | 0 | python,ruby,perl,bash,scripting-language | 2008-11-29T21:34:00.000 | 1 | 328,041 | Can you instead have it be a long-running process and answer http or rpc requests?
This would satisfy the latency requirements in almost any scenario, but I don't know if that would break your memory footprint constraints. | 0 | 8,004 | false | 0 | 1 | Scripting language choice for initial performance | 328,132 |
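To make the "long-running process" idea concrete, here is a minimal editor-added sketch using the standard library's XML-RPC server; it is not code from the thread, and the function name and port are arbitrary.

```python
# Minimal long-running RPC worker (Python 3): pay the startup cost once,
# then serve each incoming call without re-launching an interpreter.
from xmlrpc.server import SimpleXMLRPCServer

def handle(task):
    # placeholder for whatever the small C tool currently does
    return "done: %s" % task

server = SimpleXMLRPCServer(("127.0.0.1", 8000), allow_none=True)
server.register_function(handle, "handle")
server.serve_forever()
```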
3 | 8 | 1 | 4 | 56 | 1 | 0.099668 | 0 | I'm a long time C++/Java developer trying to get into Python and am looking for the stereotypical "Python for C++ Developers" article, but coming up blank. I've seen these sort of things for C#, Java, etc, and they're incredibly useful for getting up to speed on language features and noteworthy differences. Anyone have any references?
As a secondary bonus question, what open source Python program would you suggest looking at for clean design, commenting, and use of the language as a point of reference for study?
Thanks in advance. | 0 | c++,python | 2008-11-30T07:14:00.000 | 0 | 328,577 | I learned a lot about Python by reading the source of the standard library that ships with Python. I seem to remember having a few "a-ha!" moments when reading urllib2.py in particular. | 0 | 43,679 | false | 0 | 1 | Python for C++ Developers | 328,599 |
3 | 8 | 1 | 0 | 56 | 1 | 0 | 0 | I'm a long time C++/Java developer trying to get into Python and am looking for the stereotypical "Python for C++ Developers" article, but coming up blank. I've seen these sort of things for C#, Java, etc, and they're incredibly useful for getting up to speed on language features and noteworthy differences. Anyone have any references?
As a secondary bonus question, what open source Python program would you suggest looking at for clean design, commenting, and use of the language as a point of reference for study?
Thanks in advance. | 0 | c++,python | 2008-11-30T07:14:00.000 | 0 | 328,577 | For the best examples of code of a language, the language's standard library is often a good place to look. Pick a recent piece, though - old parts are probably written for older versions and also sometimes were written before the library became big enough to warrant big standards - like PHP and Erlang's libraries, which have internal inconsistency.
For Python in particular, Python 3000 is cleaning up the library a lot, and so is probably a great source of good Python code (though it is written for a future Python version). | 0 | 43,679 | false | 0 | 1 | Python for C++ Developers | 328,598 |
3 | 8 | 1 | 1 | 56 | 1 | 0.024995 | 0 | I'm a long time C++/Java developer trying to get into Python and am looking for the stereotypical "Python for C++ Developers" article, but coming up blank. I've seen these sort of things for C#, Java, etc, and they're incredibly useful for getting up to speed on language features and noteworthy differences. Anyone have any references?
As a secondary bonus question, what open source Python program would you suggest looking at for clean design, commenting, and use of the language as a point of reference for study?
Thanks in advance. | 0 | c++,python | 2008-11-30T07:14:00.000 | 0 | 328,577 | C# and Java are seen as cleaner replacements for C++ in many application areas so there is often a "migration" from one to the other - which is why there are books available.
Python and C++ are very different beasts, and although they are both considered general purpose programming languages they are targeted towards different ends of the programming spectrum.
Don't try to write C++ in Python; in fact, try to forget C++ when writing Python.
I found it far better to learn the common Python paradigms and techniques and apply them to my C++ programs than the other way around. | 0 | 43,679 | false | 0 | 1 | Python for C++ Developers | 328,689 |
1 | 2 | 0 | 1 | 5 | 0 | 0.099668 | 1 | I need to check whether a page is being redirected or not without actually downloading the content. I just need the final URL. What's the best way of doing this in Python?
Thanks! | 0 | python,http,http-headers | 2008-12-01T19:10:00.000 | 0 | 331,855 | When you open the URL with urllib2, and you're redirected, you get a status 30x for redirection. Check the info to see the location to which you're redirected. You don't need to read the page to read the info() that's part of the response. | 0 | 3,318 | false | 0 | 1 | How to determine if a page is being redirected | 331,871 |
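A modern equivalent of that urllib2 approach, added here as a hedged sketch: urllib in Python 3 follows redirects for you, and a HEAD request avoids transferring the body. The example URL is a placeholder, and if a server rejects HEAD, a normal GET whose body is simply never read works too.

```python
# Find the final URL after redirects without downloading the page body (Python 3).
import urllib.request

def final_url(url):
    req = urllib.request.Request(url, method="HEAD")   # headers only, no body
    with urllib.request.urlopen(req) as resp:          # redirects followed automatically
        return resp.geturl()

start = "http://example.com/maybe-redirected"
end = final_url(start)
print("redirected to" if end != start else "no redirect:", end)
```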
6 | 8 | 1 | 0 | 13 | 0 | 0 | 0 | We know that Python provides a lot of productivity over compiled languages. We are programming in C# and need to write the unit test cases in C# itself. The amount of code we write for unit tests is approximately ten times the original code.
Is it an ideal choice to write unit test cases in IronPython instead of C#? Has anybody done this? I wrote a few test cases, and they seem to be good. But the pointy-haired managers won't accept it. | 0 | c#,python,unit-testing,ironpython | 2008-12-04T10:21:00.000 | 0 | 340,128 | Very interesting.
What would happen if you write all your code with IronPython (not just the unit tests)? Would you end up with approximately 10 times less code?
Maybe I should learn IronPython too. | 0 | 3,355 | false | 0 | 1 | IronPython For Unit Testing over C# | 342,457 |
6 | 8 | 1 | 0 | 13 | 0 | 0 | 0 | We know that Python provides a lot of productivity over compiled languages. We are programming in C# and need to write the unit test cases in C# itself. The amount of code we write for unit tests is approximately ten times the original code.
Is it an ideal choice to write unit test cases in IronPython instead of C#? Has anybody done this? I wrote a few test cases, and they seem to be good. But the pointy-haired managers won't accept it. | 0 | c#,python,unit-testing,ironpython | 2008-12-04T10:21:00.000 | 0 | 340,128 | I gotta go with Will and Jon..
I would prefer my tests be in the same language as the code I'm testing; it causes fewer cognitive context switches. But maybe I'm just not as mentally agile as I once was.
Jon | 0 | 3,355 | false | 0 | 1 | IronPython For Unit Testing over C# | 443,959 |
6 | 8 | 1 | 3 | 13 | 0 | 0.07486 | 0 | We know that Python provides a lot of productivity over compiled languages. We are programming in C# and need to write the unit test cases in C# itself. The amount of code we write for unit tests is approximately ten times the original code.
Is it an ideal choice to write unit test cases in IronPython instead of C#? Has anybody done this? I wrote a few test cases, and they seem to be good. But the pointy-haired managers won't accept it. | 0 | c#,python,unit-testing,ironpython | 2008-12-04T10:21:00.000 | 0 | 340,128 | Actually testing is a great opportunity to try integrating a new language. Languages like Python shine especially well in testing, and it's a low risk project to try - the worst case is not too bad at all.
As far as experience testing another language in Python, I've tested C and C++ systems like this and it was excellent. I think it's definitely worth a shot.
What Jon says is true, though - the level of tooling for Python in general, and IronPython in particular, is nowhere near that of C#. How much that affects you is something you'll find out in your pilot. | 0 | 3,355 | false | 0 | 1 | IronPython For Unit Testing over C# | 342,490 |
6 | 8 | 1 | 3 | 13 | 0 | 0.07486 | 0 | We know that Python provides a lot of productivity over compiled languages. We are programming in C# and need to write the unit test cases in C# itself. The amount of code we write for unit tests is approximately ten times the original code.
Is it an ideal choice to write unit test cases in IronPython instead of C#? Has anybody done this? I wrote a few test cases, and they seem to be good. But the pointy-haired managers won't accept it. | 0 | c#,python,unit-testing,ironpython | 2008-12-04T10:21:00.000 | 0 | 340,128 | Python being a much less verbose language than C# might actually lower the barrier to writing unit tests, since there are still a lot of developers who are resistant to doing automated unit testing in general. Introducing and having them use a language like IronPython, which typically takes less time than writing the equivalent code in C#, might actually encourage more unit tests to be written, which is always a good thing.
Plus, by using IronPython for your test code, you might end up with less lines of code (LOC) for your project overall meaning that your unit tests might be more likely to be maintained in the long run versus being ignored and/or discarded. | 0 | 3,355 | false | 0 | 1 | IronPython For Unit Testing over C# | 341,683 |
6 | 8 | 1 | 6 | 13 | 0 | 1 | 0 | We know that Python provides a lot of productivity over compiled languages. We are programming in C# and need to write the unit test cases in C# itself. The amount of code we write for unit tests is approximately ten times the original code.
Is it an ideal choice to write unit test cases in IronPython instead of C#? Has anybody done this? I wrote a few test cases, and they seem to be good. But the pointy-haired managers won't accept it. | 0 | c#,python,unit-testing,ironpython | 2008-12-04T10:21:00.000 | 0 | 340,128 | Python is excellent for unit testing C# code. Our app is 75% in Python and 25% C# (Python.NET), and our unit tests are 100% Python.
I find that it's much easier to make use of stubs and mocks in Python which is probably one of the most critical components that enable one to write effective unittests. | 0 | 3,355 | false | 0 | 1 | IronPython For Unit Testing over C# | 346,907 |
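As a concrete illustration of driving a .NET class from an IronPython unittest, here is an editor-added sketch. The assembly, namespace and Calculator class are made-up names standing in for whatever C# code is under test; only clr.AddReference and unittest are real APIs.

```python
# IronPython sketch: exercise a hypothetical C# class with Python's unittest.
import unittest
import clr

clr.AddReference("MyApp")          # hypothetical compiled C# assembly (MyApp.dll)
from MyApp import Calculator       # hypothetical class exposed by that assembly

class CalculatorTests(unittest.TestCase):
    def test_add(self):
        calc = Calculator()
        self.assertEqual(calc.Add(2, 3), 5)

if __name__ == "__main__":
    unittest.main()
```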
6 | 8 | 1 | 2 | 13 | 0 | 0.049958 | 0 | We know that Python provides a lot of productivity over compiled languages. We are programming in C# and need to write the unit test cases in C# itself. The amount of code we write for unit tests is approximately ten times the original code.
Is it an ideal choice to write unit test cases in IronPython instead of C#? Has anybody done this? I wrote a few test cases, and they seem to be good. But the pointy-haired managers won't accept it. | 0 | c#,python,unit-testing,ironpython | 2008-12-04T10:21:00.000 | 0 | 340,128 | There's an obvious disadvantage, which is that everyone working on the code now needs to be proficient in two languages, not just one. I'm fairly hairy but not very pointy, but I do see why managers might be sceptical.
8 | 11 | 0 | 14 | 6 | 0 | 1.2 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | You need to take a few things into account here:
What will you gain from re-writing
Is it an economically wise decision
Will the code be easier to handle for new programmers
Performance-wise, will this be a good option?
These four points are important: will the work be more efficient after you re-write the code? Probably. But will it be worth the cost of re-development?
One important step to follow, if you decide to re-write, is to make three documents. First, analyze the project: what needs to be done? How should everything work? Then put up a document with requirements: what specifically do we need and how should this be done? Last but not least, the design document, where you put all your final class diagrams, the system operations, and how the design and flow of the page should work.
This will help a new developer, and old ones, to actually think about "do we really need to re-write?". | 0 | 1,366 | true | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 340,338 |
8 | 11 | 0 | 1 | 6 | 0 | 0.01818 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | As others have said, re-writing will take a lot longer than you think, and fixing all the bugs and making sure everything works like in the old version will take even longer. Chances are you are better off simply improving and refactoring the PHP code you have. There are only a few good reasons to port a project from one language to another:
Performance. Some languages are simply faster than others, and there comes a point where there is nothing left to optimize and throwing hardware at the problem ceases to be effective.
Maintainability. Sometimes it is hard to find good people who know some obscure language which your legacy code is written in. In those cases it might be a good idea to re-write it in a more popular language to ease maintenance down the road.
Porting to a different platform. If you all of a sudden need to make your old VB program run on OS X and Linux as well as Windows then you’re probably looking at a re-write in a different language
In your case it doesn't seem like any of the above points hold. Of course if it's an unimportant app and you want to do it for the learning experience then by all means go for it, but from a business or economic point of view I'd take a long hard look at what such a re-write will cost and what exactly you hope to gain. | 0 | 1,366 | false | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 340,792 |
8 | 11 | 0 | 0 | 6 | 0 | 0 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | Other issues include how business critical are the applications and how hard will it be to find maintainers. If the pages are hobbies of yours then I don't see a reason why you shouldn't rewrite them since if you introduce bugs or the rewrite doesn't go according to schedule a business won't lose money. If the application is central to a business I wouldn’t rewrite it unless you are running into limitations with the current design that can not be overcome with out a complete rewrite at which point the language choice is secondary to the fact that you need to throw out several years of work because it’s not maintainable and no longer meets your needs. | 0 | 1,366 | false | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 340,719 |
8 | 11 | 0 | 1 | 6 | 0 | 0.01818 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | If you are going to add more features to the code you already have working, then it might be a good idea to port it to python. After all, it will get you increased productivity. You just have to balance it, whether the rewriting task will not outweigh the potential gain...
And also, when you do that, try to unittest as much as you can. | 0 | 1,366 | false | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 340,685 |
8 | 11 | 0 | 2 | 6 | 0 | 0.036348 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | Well, it depends... ;) If you're going to use the old code together with new Python code, it might be useful, not so much for speed but for easier integration. But usually: "If it ain't broke, don't fix it". Also, rewriting can result in better code, but only do it if you need to.
As a hobby project of course it's worth it, because the process is the goal. | 0 | 1,366 | false | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 340,342
8 | 11 | 0 | 1 | 6 | 0 | 0.01818 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | I did a conversion between a PHP site and a Turbogears(Python) site for my company. The initial reason for doing so was two fold, first so a redesign would be easier and second that features could be easily added. It did take a while to get the full conversion done, but what we end up with was a very flexible back end and an even more flexible and readable front end. We've added several features that would have been very hard in PHP and we are currently doing a complete overhaul of the front end, which is turning out to be very easy.
In short it's something I would recommend, and my boss would probably say the same thing. Some people here are making good points though. Python isn't as fast as what PHP can give you, but what it lacks in performance it more then makes up for in versatility. | 0 | 1,366 | false | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 341,834 |
8 | 11 | 0 | 2 | 6 | 0 | 0.036348 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | As others have said, look at why you are doing it.
For instance, at work I am rewriting our existing inventory/sales system to a Python/django backend. Why? Because the existing PHP code base is stale, and is going to scale poorly as we grow our business (plus it was built when our business model was different, then patched up to match our current needs which resulted in some spaghetti code)
So basically, if you think you're going to benefit from it in ways that aren't just "sweet this is in python now!" then go for it. | 0 | 1,366 | false | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 340,604 |
8 | 11 | 0 | 1 | 6 | 0 | 0.01818 | 0 | I have some old apps written in PHP that I'm thinking of converting to Python - both are websites that started as simple static html, then progressed to PHP and now include blogs with admin areas, rss etc. I'm thinking of rewriting them in Python to improve maintainability as well as to take advantage of my increase in experience to write things more robustly.
Is this worth the effort? | 0 | php,python | 2008-12-04T11:48:00.000 | 0 | 340,318 | Is your aim purely to improve the applications, or is it that you want to learn/work with Python?
If it's the first, I would say you should stick with PHP, since you already know that. | 0 | 1,366 | false | 1 | 1 | Is rewriting a PHP app into Python a productive step? | 340,334 |
3 | 4 | 0 | 1 | 3 | 0 | 0.049958 | 1 | I have a directory full (~10^3 to 10^4) of XML files from which I need to extract the contents of several fields.
I've tested different xml parsers, and since I don't need to validate the contents (expensive) I was thinking of simply using xml.parsers.expat (the fastest one) to go through the files, one by one to extract the data.
Is there a more efficient way? (simple text matching doesn't work)
Do I need to issue a new ParserCreate() for each new file (or string) or can I reuse the same one for every file?
Any caveats?
Thanks! | 0 | python,xml,performance,large-files,expat-parser | 2008-12-05T17:15:00.000 | 0 | 344,559 | If you know that the XML files are generated using the ever-same algorithm, it might be more efficient to not do any XML parsing at all. E.g. if you know that the data is in lines 3, 4, and 5, you might read through the file line-by-line, and then use regular expressions.
Of course, that approach would fail if the files are not machine-generated, or originate from different generators, or if the generator changes over time. However, I'm optimistic that it would be more efficient.
Whether or not you recycle the parser objects is largely irrelevant. Many more objects will get created, so a single parser object doesn't really count much. | 0 | 572 | false | 0 | 1 | What is the most efficient way of extracting information from a large number of xml files in python? | 344,641 |
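If the files really are uniform enough for the regex shortcut described above, a sketch of that approach could look like the following; the tag name and directory are placeholders invented for illustration.

```python
# Pull one field out of every .xml file with a line-by-line regex scan (no XML parsing).
import re
from pathlib import Path

FIELD = re.compile(r"<total>(.*?)</total>")   # hypothetical element of interest

def extract_all(directory):
    for path in sorted(Path(directory).glob("*.xml")):
        with path.open(encoding="utf-8") as handle:
            for line in handle:
                match = FIELD.search(line)
                if match:
                    yield path.name, match.group(1)
                    break   # one value per file in this sketch
```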
3 | 4 | 0 | 1 | 3 | 0 | 0.049958 | 1 | I have a directory full (~10^3 to 10^4) of XML files from which I need to extract the contents of several fields.
I've tested different xml parsers, and since I don't need to validate the contents (expensive) I was thinking of simply using xml.parsers.expat (the fastest one) to go through the files, one by one to extract the data.
Is there a more efficient way? (simple text matching doesn't work)
Do I need to issue a new ParserCreate() for each new file (or string) or can I reuse the same one for every file?
Any caveats?
Thanks! | 0 | python,xml,performance,large-files,expat-parser | 2008-12-05T17:15:00.000 | 0 | 344,559 | One thing you didn't indicate is whether or not you're reading the XML into a DOM of some kind. I'm guessing that you're probably not, but on the off chance you are, don't. Use xml.sax instead. Using SAX instead of DOM will get you a significant performance boost. | 0 | 572 | false | 0 | 1 | What is the most efficient way of extracting information from a large number of xml files in python? | 345,650 |
3 | 4 | 0 | 3 | 3 | 0 | 1.2 | 1 | I have a directory full (~10^3 to 10^4) of XML files from which I need to extract the contents of several fields.
I've tested different xml parsers, and since I don't need to validate the contents (expensive) I was thinking of simply using xml.parsers.expat (the fastest one) to go through the files, one by one to extract the data.
Is there a more efficient way? (simple text matching doesn't work)
Do I need to issue a new ParserCreate() for each new file (or string) or can I reuse the same one for every file?
Any caveats?
Thanks! | 0 | python,xml,performance,large-files,expat-parser | 2008-12-05T17:15:00.000 | 0 | 344,559 | The quickest way would be to match strings (with, e.g., regular expressions) instead of parsing XML - depending on your XMLs this could actually work.
But the most important thing is this: instead of thinking through several options, just implement them and time them on a small set. This will take roughly the same amount of time, and will give you real numbers to drive you forward.
EDIT:
Are the files on a local drive or network drive? Network I/O will kill you here.
The problem parallelizes trivially - you can split the work among several computers (or several processes on a multicore computer). | 0 | 572 | true | 0 | 1 | What is the most efficient way of extracting information from a large number of xml files in python? | 344,694 |
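One common way to exploit that trivial parallelism is a process pool; this is an editor-added sketch, with parse_one standing in for whichever extraction routine wins the timing test.

```python
# Fan the per-file work out across CPU cores with multiprocessing (Python 3).
from multiprocessing import Pool
from pathlib import Path

def parse_one(path):
    # placeholder: run the chosen expat/regex extraction on a single file
    return path.name

if __name__ == "__main__":
    files = list(Path("xml_dir").glob("*.xml"))          # hypothetical directory
    with Pool() as pool:
        for result in pool.imap_unordered(parse_one, files, chunksize=100):
            print(result)
```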
4 | 9 | 0 | 2 | 43 | 0 | 0.044415 | 1 | How can I receive and send email in python? A 'mail server' of sorts.
I am looking into making an app that listens to see if it receives an email addressed to [email protected], and sends an email to the sender.
Now, am I able to do this all in Python, or would it be best to use 3rd party libraries? | 0 | python,email | 2008-12-08T00:12:00.000 | 0 | 348,392 | Depending on the amount of mail you are sending, you might want to look into using a real mail server like postfix or sendmail (*nix systems). Both of those programs have the ability to send a received mail to a program based on the email address. | 0 | 51,409 | false | 0 | 1 | Receive and send emails in python | 348,579
4 | 9 | 0 | 4 | 43 | 0 | 0.088656 | 1 | How can I receive and send email in python? A 'mail server' of sorts.
I am looking into making an app that listens to see if it receives an email addressed to [email protected], and sends an email to the sender.
Now, am I able to do this all in python, would it be best to use 3rd party libraries? | 0 | python,email | 2008-12-08T00:12:00.000 | 0 | 348,392 | poplib and smtplib will be your friends when developing your app. | 0 | 51,409 | false | 0 | 1 | Receive and send emails in python | 348,403 |
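To show what those two modules look like in practice, here is a rough editor-added check-and-reply sketch. The host names, account and addresses are placeholders; a real deployment would also need error handling and whatever authentication the provider requires.

```python
# Poll a POP3 mailbox and answer each message via SMTP (Python 3 standard library).
import poplib
import smtplib
from email.message import EmailMessage
from email.parser import BytesParser

pop = poplib.POP3_SSL("pop.example.invalid")
pop.user("info@example.invalid")
pop.pass_("secret")

count, _ = pop.stat()
for index in range(1, count + 1):
    raw = b"\r\n".join(pop.retr(index)[1])      # message lines come back as bytes
    incoming = BytesParser().parsebytes(raw)

    reply = EmailMessage()
    reply["From"] = "info@example.invalid"
    reply["To"] = incoming["From"]
    reply["Subject"] = "Re: " + (incoming["Subject"] or "")
    reply.set_content("Thanks, we received your message.")

    with smtplib.SMTP("smtp.example.invalid") as smtp:
        smtp.send_message(reply)

pop.quit()
```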
4 | 9 | 0 | 12 | 43 | 0 | 1 | 1 | How can I receive and send email in python? A 'mail server' of sorts.
I am looking into making an app that listens to see if it receives an email addressed to [email protected], and sends an email to the sender.
Now, am I able to do this all in python, would it be best to use 3rd party libraries? | 0 | python,email | 2008-12-08T00:12:00.000 | 0 | 348,392 | I do not think it would be a good idea to write a real mail server in Python. This is certainly possible (see mcrute's and Manuel Ceron's posts to have details) but it is a lot of work when you think of everything that a real mail server must handle (queuing, retransmission, dealing with spam, etc).
You should explain in more detail what you need. If you just want to react to incoming email, I would suggest to configure the mail server to call a program when it receives the email. This program could do what it wants (updating a database, creating a file, talking to another Python program).
To call an arbitrary program from the mail server, you have several choices:
For sendmail and Postfix, a ~/.forward containing "|/path/to/program"
If you use procmail, a recipe action of |path/to/program
And certainly many others | 0 | 51,409 | false | 0 | 1 | Receive and send emails in python | 349,352 |
4 | 9 | 0 | 7 | 43 | 0 | 1 | 1 | How can I receive and send email in python? A 'mail server' of sorts.
I am looking into making an app that listens to see if it receives an email addressed to [email protected], and sends an email to the sender.
Now, am I able to do this all in Python, or would it be best to use 3rd party libraries? | 0 | python,email | 2008-12-08T00:12:00.000 | 0 | 348,392 | Python has an smtpd module that will be helpful to you for writing a server. You'll probably also want the smtplib module to do the re-send. Both modules are in the standard library at least since version 2.3. | 0 | 51,409 | false | 0 | 1 | Receive and send emails in python | 348,423
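A sketch of that smtpd approach follows. Note that smtpd and asyncore were deprecated and eventually removed from recent Python releases, so this reflects the era of the answer; the listening port and addresses are placeholders.

```python
# Tiny SMTP receiver that auto-replies to one address (legacy smtpd/asyncore APIs).
import asyncore
import smtpd
import smtplib
from email.message import EmailMessage

class AutoReplyServer(smtpd.SMTPServer):
    def process_message(self, peer, mailfrom, rcpttos, data, **kwargs):
        if "info@example.invalid" in rcpttos:          # hypothetical monitored address
            reply = EmailMessage()
            reply["From"] = "info@example.invalid"
            reply["To"] = mailfrom
            reply["Subject"] = "Thanks"
            reply.set_content("We received your mail.")
            with smtplib.SMTP("localhost") as smtp:
                smtp.send_message(reply)

server = AutoReplyServer(("0.0.0.0", 1025), None)       # listen on port 1025
asyncore.loop()
```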
3 | 4 | 0 | 10 | 11 | 0 | 1.2 | 0 | I'm curious to know if there is an easy way to mock an IMAP server (a la the imaplib module) in Python, without doing a lot of work.
Is there a pre-existing solution? Ideally I could connect to the existing IMAP server, do a dump, and have the mock server run off the real mailbox/email structure.
Some background into the laziness: I have a nasty feeling that this small script I'm writing will grow over time and would like to create a proper testing environment, but given that it might not grow over time, I don't want to do much work to get the mock server running. | 0 | python,testing,imap,mocking | 2008-12-09T02:57:00.000 | 0 | 351,656 | I found it quite easy to write an IMAP server in twisted last time I tried. It comes with support for writing IMAP servers and you have a huge amount of flexibility. | 0 | 4,288 | true | 0 | 1 | How do I mock an IMAP server in Python, despite extreme laziness? | 351,675 |
3 | 4 | 0 | 7 | 11 | 0 | 1 | 0 | I'm curious to know if there is an easy way to mock an IMAP server (a la the imaplib module) in Python, without doing a lot of work.
Is there a pre-existing solution? Ideally I could connect to the existing IMAP server, do a dump, and have the mock server run off the real mailbox/email structure.
Some background into the laziness: I have a nasty feeling that this small script I'm writing will grow over time and would like to create a proper testing environment, but given that it might not grow over time, I don't want to do much work to get the mock server running. | 0 | python,testing,imap,mocking | 2008-12-09T02:57:00.000 | 0 | 351,656 | How much of it do you really need for any one test? If you start to build something on the order of complexity of a real server so that you can use it on all your tests, you've already gone wrong. Just mock the bits any one test needs.
Don't bother trying so hard to share a mock implementation. They're not supposed to be assets, but discardable bits-n-pieces. | 0 | 4,288 | false | 0 | 1 | How do I mock an IMAP server in Python, despite extreme laziness? | 353,175 |
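In the same "mock only the bits one test needs" spirit, the standard unittest.mock module (a later addition to the standard library than this thread) can stand in for imaplib without any server at all. The module and functions under test here are hypothetical names used purely for illustration.

```python
# Stub out imaplib for a single test with unittest.mock (Python 3.3+).
import unittest
from unittest import mock

import myscript   # hypothetical module that talks to IMAP via imaplib


class InboxTest(unittest.TestCase):
    @mock.patch("myscript.imaplib.IMAP4_SSL")
    def test_logs_in_and_selects_inbox(self, imap_cls):
        conn = imap_cls.return_value
        conn.search.return_value = ("OK", [b""])   # pretend the mailbox is empty
        myscript.check_mail()                      # hypothetical entry point
        conn.login.assert_called_once()
        conn.select.assert_called_once_with("INBOX")


if __name__ == "__main__":
    unittest.main()
```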
3 | 4 | 0 | 1 | 11 | 0 | 0.049958 | 0 | I'm curious to know if there is an easy way to mock an IMAP server (a la the imaplib module) in Python, without doing a lot of work.
Is there a pre-existing solution? Ideally I could connect to the existing IMAP server, do a dump, and have the mock server run off the real mailbox/email structure.
Some background into the laziness: I have a nasty feeling that this small script I'm writing will grow over time and would like to create a proper testing environment, but given that it might not grow over time, I don't want to do much work to get the mock server running. | 0 | python,testing,imap,mocking | 2008-12-09T02:57:00.000 | 0 | 351,656 | I never tried but, if I had to, I would start with the existing SMTP server. | 0 | 4,288 | false | 0 | 1 | How do I mock an IMAP server in Python, despite extreme laziness? | 352,194 |
9 | 11 | 0 | 31 | 169 | 1 | 1 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | I work as a bioinformatician, and most of the code I write is "one time, one task" scripts, code that will be run only once or twice and that execute a single specific task.
In this situation, writing big unit tests may be overkill, and doctests are a useful compromise. They are quicker to write, and since they are usually incorporated in the code, they let you always keep an eye on how the code should behave, without having to have another file open. That's useful when writing small scripts.
Also, doctests are useful when you have to pass your script to a researcher who is not an expert in programming. Some people find it very difficult to understand how unit tests are structured; on the other hand, doctests are simple examples of usage, so people can just copy and paste them to see how to use them.
So, to summarize my answer: doctests are useful when you have to write small scripts, and when you have to pass them or show them to researchers who are not computer scientists. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 10,861,736
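For readers who have not seen one, this is roughly what such a doctest looks like; the toy function is editor-added, purely to illustrate the format.

```python
def mean(values):
    """Return the arithmetic mean of a sequence of numbers.

    >>> mean([1, 2, 3, 4])
    2.5
    >>> mean([10])
    10.0
    """
    return sum(values) / len(values)

if __name__ == "__main__":
    import doctest
    doctest.testmod()   # silent when every example passes
```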
9 | 11 | 0 | 37 | 169 | 1 | 1 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | Another advantage of doctesting is that you get to make sure your code does what your documentation says it does. After a while, software changes can make your documentation and code do different things. :-) | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 667,829 |
9 | 11 | 0 | 7 | 169 | 1 | 1 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | Using both is a valid and rather simple option. The doctest module provides the DoctTestSuite and DocFileSuite methods which create a unittest-compatible testsuite from a module or file, respectively.
So I use both and typically use doctest for simple tests with functions that require little or no setup (simple types for arguments). I actually think a few doctest tests help document the function, rather than detract from it.
But for more complicated cases, and for a more comprehensive set of test cases, I use unittest which provides more control and flexibility. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 361,788 |
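A small sketch of that combination, using the load_tests protocol; "mymodule" is a placeholder for whichever module carries the doctests.

```python
# Run a module's doctests and regular unittest cases in one go.
import doctest
import unittest

import mymodule   # hypothetical module whose docstrings contain doctests

def load_tests(loader, tests, ignore):
    tests.addTests(doctest.DocTestSuite(mymodule))
    return tests

class ExtraTests(unittest.TestCase):
    def test_module_has_documentation(self):
        self.assertTrue(mymodule.__doc__)

if __name__ == "__main__":
    unittest.main()
```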
9 | 11 | 0 | 7 | 169 | 1 | 1 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | I use unittest exclusively; I think doctest clutters up the main module too much. This probably has to do with writing thorough tests. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 361,703 |
9 | 11 | 0 | 50 | 169 | 1 | 1 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | I use unittest almost exclusively.
Once in a while, I'll put some stuff in a docstring that's usable by doctest.
95% of the test cases are unittest.
Why? I like keeping docstrings somewhat shorter and more to the point. Sometimes test cases help clarify a docstring. Most of the time, the application's test cases are too long for a docstring. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 361,680 |
9 | 11 | 0 | 8 | 169 | 1 | 1 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | I don't use doctest as a replacement for unittest. Although they overlap a bit, the two modules don't have the same function:
I use unittest as a unit testing framework, meaning it helps me determine quickly the impact of any modification on the rest of the code.
I use doctest as a guarantee that comments (namely docstrings) are still relevant to current version of the code.
The widely documented benefits of test driven development I get from unittest. doctest solves the far more subtle danger of having outdated comments misleading the maintenance of the code. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 13,722,080 |
9 | 11 | 0 | 4 | 169 | 1 | 0.072599 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | I almost never use doctests. I want my code to be self documenting, and the docstrings provide the documentation to the user. IMO adding hundreds of lines of tests to a module makes the docstrings far less readable. I also find unit tests easier to modify when needed. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 361,698 |
9 | 11 | 0 | 3 | 169 | 1 | 0.054491 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | I prefer the discovery based systems ("nose" and "py.test", using the former currently).
doctest is nice when the test is also good as a documentation, otherwise they tend to clutter the code too much. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 361,886 |
9 | 11 | 0 | 14 | 169 | 1 | 1 | 0 | I'm trying to get started with unit testing in Python and I was wondering if someone could explain the advantages and disadvantages of doctest and unittest.
What conditions would you use each for? | 0 | python,unit-testing,comparison,doctest | 2008-12-12T01:50:00.000 | 0 | 361,675 | If you're just getting started with the idea of unit testing, I would start with doctest because it is so simple to use. It also naturally provides some level of documentation. And for more comprehensive testing with doctest, you can place tests in an external file so it doesn't clutter up your documentation.
I would suggest unittest if you're coming from a background of having used JUnit or something similar, where you want to be able to write unit tests in generally the same way as you have been elsewhere. | 0 | 29,175 | false | 0 | 1 | Python - doctest vs. unittest | 361,683 |
3 | 7 | 0 | 0 | 4 | 0 | 0 | 0 | I'd like to try out Eclipse, but I'm a bit baffled with all the different distributions of it. I mainly program in Python doing web development, but I also need to maintain PHP and Perl apps. It looks like EasyEclipse is a bit behind. Should I just grab the base Eclipse and start loading plug-ins? | 0 | php,python,perl,eclipse | 2008-12-12T23:26:00.000 | 0 | 364,486 | I develop in PHP, python, C(python modules), SQL and JS/HTML/CSS all on eclipse. I do this
by installing PDT, CDT, pydev and SQL tools onto the eclipse-platform, and then using different workspaces for mixed projects.
Two workspaces to be specific, one for PHP web development and another for Python/C. I do run it on a rather powerful machine so I allow eclipse the luxury of added memory (2G).
Works like a charm and it is very nice to be able to use the same IDE for everything :) | 0 | 2,130 | false | 1 | 1 | Which Eclipse distribution is good for web development using Python, PHP, or Perl? | 6,481,753 |
3 | 7 | 0 | 0 | 4 | 0 | 0 | 0 | I'd like to try out Eclipse, but I'm a bit baffled with all the different distributions of it. I mainly program in Python doing web development, but I also need to maintain PHP and Perl apps. It looks like EasyEclipse is a bit behind. Should I just grab the base Eclipse and start loading plug-ins? | 0 | php,python,perl,eclipse | 2008-12-12T23:26:00.000 | 0 | 364,486 | I use the JavaScript Eclipse Helios distribution and added the PyDev plugin to it for Django support; it seems to do everything I need. | 0 | 2,130 | false | 1 | 1 | Which Eclipse distribution is good for web development using Python, PHP, or Perl? | 18,023,885
3 | 7 | 0 | 0 | 4 | 0 | 0 | 0 | I'd like to try out Eclipse, but I'm a bit baffled with all the different distributions of it. I mainly program in Python doing web development, but I also need to maintain PHP and Perl apps. It looks like EasyEclipse is a bit behind. Should I just grab the base Eclipse and start loading plug-ins? | 0 | php,python,perl,eclipse | 2008-12-12T23:26:00.000 | 0 | 364,486 | PyDev is pretty decent as I'm sure you know. It can fit on top of all the Eclipse distributions (provided they meet the minimum version requirements). If you're doing webdev stuff, you'll probably find the closest fit with Aptana.
That said, I find Aptana hideously clunky when compared to a decent text editor. I build sites using django and for that I use Eclipse (pure) and PyDev to do the python and gedit (gnome's souped up notepad) for writing the HTML for templates/CSS/JS/etc.
At the end of the day, whatever suits you best is what you'll go with. | 0 | 2,130 | false | 1 | 1 | Which Eclipse distribution is good for web development using Python, PHP, or Perl? | 364,517 |
4 | 5 | 0 | 17 | 274 | 1 | 1 | 0 | What are the main differences among them? And in which typical scenarios is it better to use each language? | 0 | python,perl,sed,awk,language-comparisons | 2008-12-14T21:00:00.000 | 0 | 366,980 | First, there are two unrelated things in the list "Perl, Python awk and sed".
Thing 1 - simplistic text manipulation tools.
sed. It has a fixed, relatively simple scope of work defined by the idea of reading and examining each line of a file. sed is not designed to be particularly readable. It is designed to be very small and very efficient on very tiny unix servers.
awk. It has a slightly less fixed, less simple scope of work. However, the main loop of an awk program is defined by the implicit reading of lines of a source file.
These are not "complete" programming languages. While you can -- with some work -- write fairly sophisticated programs in awk, it rapidly gets complicated and difficult to read.
Thing 2 - general-purposes programming languages. These have a rich variety of statement types, numerous built-in data structures, and no wired-in assumptions or shortcuts to speak of.
Perl.
Python.
When to use them.
sed. Never. It really doesn't have any value in the modern era of computers with more than 32K of memory. Perl or Python do the same things more clearly.
awk. Never. Like sed, it reflects an earlier era of computing. Rather than maintain this language (in addition to all the other required for a successful system), it's more pleasant to simply do everything in one pleasant language.
Perl. Any programming problem of any kind. If you like free-thinking syntax, where there are many, many ways to do the same thing, perl is fun.
Python. Any programming problem of any kind. If you like fairly limited syntax, where there are fewer choices, less subtlety, and (perhaps) more clarity. Python's object-oriented nature makes it more suitable for large, complex problems.
Background -- I'm not bashing sed and awk out of ignorance. I learned awk over 20 years ago. Did many things with it; used to teach it as a core unix skill. I learned Perl about 15 years ago. Did many sophisticated things with it. I've left both behind because I can do the same things in Python -- and it is simpler and more clear.
There are two serious problems with sed and awk, neither of which are their age.
The incompleteness of their implementation. Everything sed and awk do can be done in Python or Perl, often more simply and sometimes faster, too. A shell pipeline has some performance advantages because of its multi-processing. Python offers a subprocess module to allow me to recover those advantages.
The need to learn yet another language. By doing things in Python (or Perl) your implementation depends on fewer languages, with a resulting increase in clarity. | 0 | 86,075 | false | 0 | 1 | What are the differences between Perl, Python, AWK and sed? | 367,082 |
4 | 5 | 0 | 588 | 274 | 1 | 1.2 | 0 | What are the main differences among them? And in which typical scenarios is it better to use each language? | 0 | python,perl,sed,awk,language-comparisons | 2008-12-14T21:00:00.000 | 0 | 366,980 | In order of appearance, the languages are sed, awk, perl, python.
The sed program is a stream editor and is designed to apply the actions from a script to each line (or, more generally, to specified ranges of lines) of the input file or files. Its language is based on ed, the Unix editor, and although it has conditionals and so on, it is hard to work with for complex tasks. You can work minor miracles with it - but at a cost to the hair on your head. However, it is probably the fastest of the programs when attempting tasks within its remit. (It has the least powerful regular expressions of the programs discussed - adequate for many purposes, but certainly not PCRE - Perl-Compatible Regular Expressions)
The awk program (name from the initials of its authors - Aho, Weinberger, and Kernighan) is a tool initially for formatting reports. It can be used as a souped-up sed; in its more recent versions, it is computationally complete. It uses an interesting idea - the program is based on 'patterns matched' and 'actions taken when the pattern matches'. The patterns are fairly powerful (Extended Regular Expressions). The language for the actions is similar to C. One of the key features of awk is that it splits the input automatically into records and each record into fields.
Perl was written in part as an awk-killer and sed-killer. Two of the programs provided with it are a2p and s2p for converting awk scripts and sed scripts into Perl. Perl is one of the earliest of the next generation of scripting languages (Tcl/Tk can probably claim primacy). It has powerful integrated regular expression handling with a vastly more powerful language. It provides access to almost all system calls and has the extensibility of the CPAN modules. (Neither awk nor sed is extensible.) One of Perl's mottos is "TMTOWTDI - There's more than one way to do it" (pronounced "tim-toady"). Perl has 'objects', but it is more of an add-on than a fundamental part of the language.
Python was written last, and probably in part as a reaction to Perl. It has some interesting syntactic ideas (indenting to indicate levels - no braces or equivalents). It is more fundamentally object-oriented than Perl; it is just as extensible as Perl.
OK - when to use each?
Sed - when you need to do simple text transforms on files.
Awk - when you only need simple formatting and summarisation or transformation of data.
Perl - for almost any task, but especially when the task needs complex regular expressions.
Python - for the same tasks that you could use Perl for.
I'm not aware of anything that Perl can do that Python can't, nor vice versa. The choice between the two would depend on other factors. I learned Perl before there was a Python, so I tend to use it. Python has less accreted syntax and is generally somewhat simpler to learn. Perl 6, when it becomes available, will be a fascinating development.
(Note that the 'overviews' of Perl and Python, in particular, are woefully incomplete; whole books could be written on the topic.) | 0 | 86,075 | true | 0 | 1 | What are the differences between Perl, Python, AWK and sed? | 367,014 |
4 | 5 | 0 | 100 | 274 | 1 | 1 | 0 | What are the main differences among them? And in which typical scenarios is it better to use each language? | 0 | python,perl,sed,awk,language-comparisons | 2008-12-14T21:00:00.000 | 0 | 366,980 | After mastering a few dozen languages, you get tired of people like S. Lott (see his controversial answer to this question, nearly half as many down-votes as up (+45/-22) six years after answering).
Sed is the best tool for extremely simple command-line pipelines. In the hands of a sed master, it's suitable for one-offs of arbitrary complexity, but it should not be used in production code except in very simple substitution pipelines. Stuff like 's/this/that/.'
Gawk (the GNU awk) is by far the best choice for complex data reformatting when there is only a single input source and a single output (or, multiple outputs sequentially written). Since a great deal of real-world work conforms to this description, and a good programmer can learn gawk in two hours, it is the best choice. On this planet, simpler and faster is better!
Perl or Python are far better than any version of awk or sed when you have very complex input/output scenarios. The more complex the problem is, the better off you are using python, from a maintenance and readability standpoint. Note, however, that a good programmer can write readable code in any language, and a bad programmer can write unmaintainable crap in any useful language, so the choice of perl or python can safely be left to the preferences of the programmer if said programmer is skilled and clever. | 0 | 86,075 | false | 0 | 1 | What are the differences between Perl, Python, AWK and sed? | 2,905,791 |
4 | 5 | 0 | 22 | 274 | 1 | 1 | 0 | What are the main differences among them? And in which typical scenarios is it better to use each language? | 0 | python,perl,sed,awk,language-comparisons | 2008-12-14T21:00:00.000 | 0 | 366,980 | I wouldn't call sed a fully-fledged programming language; it is a stream editor with language constructs aimed at editing text files programmatically.
Awk is a little more of a general purpose language but it is still best suited for text processing.
Perl and Python are fully fledged, general purpose programming languages. Perl has its roots in text processing and has a number of awk-like constructs (there is even an awk-to-perl script floating around on the net). There are many differences between Perl and Python, your best bet is probably to read the summaries of both languages on something like Wikipedia to get a good grasp on what they are. | 0 | 86,075 | false | 0 | 1 | What are the differences between Perl, Python, AWK and sed? | 367,002 |
1 | 6 | 0 | 10 | 17 | 1 | 1.2 | 0 | I know of python -c '<code>', but I'm wondering if there's a more elegant python equivalent to perl -pi -e '<code>'. I still use it quite a bit for things like find and replace in a whole directory (perl -pi -e s/foo/bar/g * or even find . | xargs perl -pi -e s/foo/bar/g for sub-directories).
I actually feel that that which makes Perl Perl (free form Tim Toady-ness) is what makes perl -pi -e work so well, while with Python you'd have to do something along the lines of importing the re module, creating an re instance and then capture stdin, but maybe there's a Python shortcut that does all that and I missed it (sorely missed it)... | 0 | python,perl,command-line,language-features | 2008-12-14T22:57:00.000 | 0 | 367,115 | The command line usage from 'python -h' certainly strongly suggests there is no such equivalent. Perl tends to make extensive use of '$_' (your examples make implicit use of it), and I don't think Python supports any similar concept, thereby making Python equivalents of the Perl one-liners much harder. | 0 | 3,183 | true | 0 | 1 | Is there a Python equivalent to `perl -pi -e`? | 367,181 |
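For comparison, the standard fileinput module comes reasonably close to perl -pi -e when used from a small script; a hedged sketch mirroring the question's s/foo/bar/ example (the file names come from the command line, and the pattern is just the placeholder from the question):

    import fileinput
    import re
    import sys

    # roughly: perl -pi -e 's/foo/bar/g' file1 file2 ...
    for line in fileinput.input(sys.argv[1:], inplace=1):
        sys.stdout.write(re.sub(r"foo", "bar", line))

With inplace=1, fileinput redirects standard output into the file being processed, so whatever the loop writes replaces the original contents line by line.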
1 | 6 | 0 | -1 | 2 | 1 | -0.033321 | 0 | I want something simple in order to experiment/hack. I've created a lot interpreters/compilers for c and I just want something simple. A basic BASIC :D
If you don't know any (I've done my google search...), yacc/bison is the only way?
Thx | 0 | python,ruby,interpreter | 2008-12-15T19:12:00.000 | 0 | 369,391 | There is pybasic (python basic), rockit-minibasic (rubybasic).
To make these able to use the gui, then one has to develop extensions with kivy and shoes gui toolkits for pybasic and rockit-minibasic respectively and similarly prima gui for perlbasic if ever exists. | 0 | 2,470 | false | 0 | 1 | Is there an OpenSource BASIC interpreter in Ruby/Python? | 12,280,987 |
1 | 2 | 0 | 3 | 0 | 0 | 0.291313 | 0 | How would I only allow users authenticated via Python code to access certain files on the server?
For instance, say I have /static/book.txt which I want to protect. When a user accesses /some/path/that/validates/him, a Python script deems him worthy of accessing /static/book.txt and redirects him to that path.
How would I stop users who bypass the script and directly access /static/book.txt? | 0 | python,security,apache,download,lighttpd | 2008-12-16T19:50:00.000 | 0 | 372,465 | You might want to just have your Python script open the file and dump the contents as its output if the user is properly authenticated. Put the files you want to protect in a folder that is outside of the webserver root. | 0 | 274 | false | 0 | 1 | Protecting online static content | 372,488 |
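A minimal sketch of the "serve the file from the script" idea described above; the directory path and the authentication check are hypothetical placeholders, not anything from the question:

    import os

    PROTECTED_DIR = "/srv/protected"   # assumed location outside the webserver document root

    def is_authenticated(user):
        # placeholder for the application's real authentication logic
        return user is not None

    def serve_protected(filename, user):
        """Return a (status, body) pair for a protected static file."""
        if not is_authenticated(user):
            return "403 Forbidden", b""
        # basename() strips any directory components the client might sneak in
        path = os.path.join(PROTECTED_DIR, os.path.basename(filename))
        with open(path, "rb") as f:
            return "200 OK", f.read()

Because the files live outside the document root, the webserver never exposes them directly; only the authenticated code path can read them.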
11 | 14 | 0 | 1 | 2 | 1 | 0.014285 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | Actual C++ OO memory overhead is one pointer (4-8 bytes, depending) per object with virtual methods. However, as mentioned in other answers, the default memory allocation overhead from dynamic allocation is likely to be significantly greater than this.
If you're doing things halfway reasonably, neither overhead is likely to be significant compared with a 1000*8-byte double array. If you're actually worried about allocation overhead, you can write your own allocator -- but check first to see if it will actually buy you a significant improvement. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,779
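To put the answer's claim in rough numbers, a back-of-the-envelope calculation (the one-pointer-per-object figure assumes a 64-bit build and is illustrative, not measured):

    rows, cols = 250000, 1000
    raw_bytes = rows * cols * 8    # the doubles themselves: about 2 GB in memory
    vptr_bytes = rows * 8          # one vtable pointer per object
    print(raw_bytes, vptr_bytes, 100.0 * vptr_bytes / raw_bytes)  # overhead is roughly 0.1%

So even a quarter of a million objects add only a couple of megabytes of per-object overhead next to the doubles they hold.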
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | Since you can split the data in half and operate on it, I'm assuming that you're working on each record individually? It sounds to me like you need to change your deserialiser to read one record at a time, manipulate it, and then store out the results.
Basically you need a string parser class that does a Peek() which returns a char, knows how to skip whitespace, etc. Wrap a class around that that understands your data format, and you should be able to have it spit out an object at a time as it reads the file. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,848 |
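A sketch of the record-at-a-time idea in Python, assuming (purely for illustration) a plain text file with one 1,000-value record per line:

    def records(path):
        """Yield one record (a list of floats) at a time instead of loading the whole cube."""
        with open(path) as f:
            for line in f:
                yield [float(x) for x in line.split()]

    # hypothetical usage: process each record and write the result out immediately
    # for rec in records("datacube.txt"):
    #     handle(sum(rec) / len(rec))

Only one record is ever held in memory, which sidesteps the need to load the full multi-gigabyte file.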
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | Impossible to answer without knowing the shape of the data and the structure that you've designed to contain it. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,517 |
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | A friend of mine was a professor at MIT and a student asked him why his image analysis program was running so slow. How was it built? Every pixel was an object, and would send messages to its neighbors!
If I were you I'd try it in a throw-away program. My suspicion is, unless your classes are very carefully coded, you're going to find it spending a lot of time allocating, initializing, and de-allocating objects, and as Brian said, you might be able to spool the data through a set of re-used objects.
Edit: Excuse me. You said you are re-using objects, so that's good. In any case, when you get it running you could profile it or (if you were me) just read the call stack a few random times, and that will answer any questions about where the time goes. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,801 |
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | Please define "manipulate". If you really want to manipulate 4 gigs of data why do you want to manipulate it by pulling it ALL into memory right away?
I mean, who needs 4 gig of RAM anyway? :) | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,724 |
11 | 14 | 0 | 2 | 2 | 1 | 0.028564 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | I wouldn't consider it fair to blame any shortcomings of your design to OOP. Just like any other programming platform out there OO can be used for both good and less than optimal design. Rarely will this be the fault of the programming model itself.
But to try to answer your question: Allocating 250000 new object requires some overhead in all OO language that I'm aware of, so if you can get away with streaming the data through the same instance, you're probably better off. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,524 |
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | I don't think the question is overhead coming from OO.
If we accept C++ as an OO language and remember that the C++ compiler is a preprocessor to C (at least it used to be, when I used C++), anything done in C++ is really done in C. C has very little overhead. So it would depend on the libraries.
I think any overhead would come from interpretation, managed execution or memory management. For those that have the tools and the know-how, it would be very easy to find out which is most efficient, C++ or Python.
I can't see where C++ would add much avoidable overhead. I don't know much about Python. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,658 |
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | Like the other posters have stated. I do not believe Objects are going to lend a significant amount of overhead to your process. It will need to store a pointer to the object but the rest of the 'doubles' will be taking 99% of your program's memory.
Can you partition this data into much smaller subsets? What is the task that you are trying to accomplish? I would be interested in seeing what you need all the data in memory for. Perhaps you can just serialize it, or use something like lazy evaluation in haskell.
Please post a follow up so we can understand your problem domain better. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,641 |
11 | 14 | 0 | 3 | 2 | 1 | 0.042831 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | You'd have similar issues with procedural/functional programming languages. How do you store that much data in memory? A struct or array wouldn't work either.
You need to take special steps to manage this scale of data.
BTW: I wouldn't use this as a reason to pick either an OO language or not. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,566 |
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | The "overhead" depends largely on the platform and the implementation you chose.
Now if you have a memory problem reading millions of records from a multi-GB file, you have an algorithm problem where the memory consumption of objects is definitely not the biggest concern; the concern would be more about how you fetch, process and store the data. | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,595
11 | 14 | 0 | 0 | 2 | 1 | 0 | 0 | I have a large set of data (a data cube of 250,000 X 1,000 doubles, about a 4 gig file) and I want to manipulate it using a previous set of OOP classes I have written in Python. Currently the data set is already so large that to read into my machine memory I have to at least split it in half so computing overhead is a concern. My OOP classes create new objects (in this case I will need 250,000 new objects, each object is an array of 1,000 doubles) to handle the data. What is the overhead in terms of memory and computing required in creating objects for a generic OOP language? In python? What about in C++?
Yes, I realize I could make a new class that is an array. But 1) I already have these classes finished and 2) I put each object that I create back into an array for access later anyways. The question is pedagogical
*update: I want to be efficient with time, my time and the computers. I don't want to rewrite a program I already have if I don't have to and spending time optimizing the code wastes my time, I don't care that much if I waste the computers time. I actually do have a 64bit machine with 4Gig ram. The data is an image and I need to do several filters on each pixel.* | 0 | python,oop,data-analysis | 2008-12-16T20:04:00.000 | 0 | 372,511 | compared to the size of your data set, the overhead of 250K objects is negligible
i think you're on the wrong path; don't blame objects for that ;-) | 0 | 3,461 | false | 0 | 1 | What is the object oriented programming computing overhead cost? | 372,706 |
1 | 9 | 0 | 0 | 23 | 0 | 0 | 1 | I'm using python and I need to map locations like "Bloomington, IN" to GPS coordinates so I can measure distances between them.
What Geocoding libraries/APIs do you recommend? Solutions in other languages are also welcome. | 0 | python,api,rest,geocoding | 2008-12-17T01:19:00.000 | 0 | 373,383 | You can take a closer look at the geopy module. It is well worth using, as it contains Google Maps and Yahoo Maps geocoders with which you can implement geocoding. | 0 | 13,754 | false | 0 | 1 | Geocoding libraries | 2,229,732
1 | 1 | 0 | 2 | 7 | 0 | 0.379949 | 0 | Beyond offering an API for my website, I'd like to offer users the ability to write simple scripts that would run on my servers . The scripts would have access to objects owned by the user and be able to manipulate, modify, and otherwise process their data.
I'd like to be able to limit resources taken by these scripts at a fine level (eg. max execution time should be 100ms). I'd also like to ensure a secure sandbox such that each user will have access to only a limited set of data and resources, and be prevented from accessing disk, other people's data, etc.
Generally the scripts will be very simple (eg. create the sum or average of the values that match certain criteria), and they'll often be used in templates (eg. fill in the value of this cell or html element with the average or sum).
Ideally I'd like to use a sandboxed subset of a well know, commonly available programming language so it's easy for users to pick up. The backend is written in Python, so a Python based language could have benefits, but I'm open to other languages and technologies. Javascript is also attractive due to its simple nature and common availability.
The languages should support creation of DSLs and libraries.
The target audience is a general user base for a web based application, not necessarily very technical. In other words, it's not targeted at a base with particular knowledge of any particular programming language. My expectation is a subset of users will create scripts that will be used by the larger majority.
Any ideas or recommendations for the language and technology? Any examples of others trying this and the successes and failures they encountered? | 0 | javascript,python,sandbox | 2008-12-17T01:38:00.000 | 0 | 373,406 | I use Lua for this, but it's directed at a Lua capable community. So my answer would be who are your users?
If your users are internal, like my case, and proficient with Python use Python. However if this is something for the world wide web, I'd probably choose javascript, because its the lingua franca, (every developer knows it, and its easy to pickup). As for an Engine... well V8 would be nice, but its not 100% thread safe, in that you can't run several engine within the same process in a lock free manner, as you can with SpiderMonkey. So You might want to use that. Also since javascript is sandboxed by default you won't have to worry about implementing much on your side. | 0 | 421 | false | 1 | 1 | Secure, sandboxable user exposed programming language / environment? | 373,415 |
3 | 5 | 0 | 1 | 18 | 1 | 0.039979 | 0 | In Python properties are used instead of the Java-style getters, setters. So one rarely sees get... or set.. methods in the public interfaces of classes.
But in cases where a property is not appropriate one might still end up with methods that behave like getters or setters. Now my question: should these method names start with get_ / set_? Or is this unpythonic verbosity, since it is often obvious what is meant (and one can still use the docstring to clarify non-obvious situations)?
This might be a matter of personal taste, but I would be interested in what the majority thinks about this. What would you prefer as an API user?
Example: Say we have an object representing multiple cities. One might have a method get_city_by_postalcode(postalcode) or one could use the shorter name city_by_postalcode. I tend towards the latter. | 0 | python,coding-style | 2008-12-17T14:49:00.000 | 0 | 374,763 | I've seen it done both ways. Coming from an Objective-C background, I usually do foo()/set_foo() if I can't use a property (although I try to use properties whenever possible). It doesn't really matter that much, though, as long as you're consistent.
(Of course, in your example, I wouldn't call the method get_city_by_postalcode() at all; I'd probably go with translate_postalcode or something similar that uses a better action verb in the name.) | 0 | 4,305 | false | 0 | 1 | Should I use get_/set_ prefixes in Python method names? | 374,860 |
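Since the answer above leans on properties wherever possible, a minimal sketch for readers unfamiliar with them (the Temperature class and celsius attribute are made-up illustrations):

    class Temperature(object):
        def __init__(self):
            self._celsius = 0.0

        def _get_celsius(self):
            return self._celsius

        def _set_celsius(self, value):
            self._celsius = float(value)

        celsius = property(_get_celsius, _set_celsius)

    t = Temperature()
    t.celsius = 21        # the setter runs behind the scenes
    print(t.celsius)      # 21.0

Callers use plain attribute syntax, so no explicit get_/set_ names ever appear in the public interface.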
3 | 5 | 0 | 0 | 18 | 1 | 0 | 0 | In Python properties are used instead of the Java-style getters, setters. So one rarely sees get... or set.. methods in the public interfaces of classes.
But in cases where a property is not appropriate one might still end up with methods that behave like getters or setters. Now my question: should these method names start with get_ / set_? Or is this unpythonic verbosity, since it is often obvious what is meant (and one can still use the docstring to clarify non-obvious situations)?
This might be a matter of personal taste, but I would be interested in what the majority thinks about this. What would you prefer as an API user?
Example: Say we have an object representing multiple cities. One might have a method get_city_by_postalcode(postalcode) or one could use the shorter name city_by_postalcode. I tend towards the latter. | 0 | python,coding-style | 2008-12-17T14:49:00.000 | 0 | 374,763 | If I have to use a getter/setter, I like it this way:
Suppose you have a variable self._x. Then x() would return the value of self._x, and setX(x) would set the value of self._x | 0 | 4,305 | false | 0 | 1 | Should I use get_/set_ prefixes in Python method names? | 375,661 |
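A tiny sketch of the convention just described (the class name Foo is purely illustrative):

    class Foo(object):
        def __init__(self, x=0):
            self._x = x

        def x(self):
            return self._x

        def setX(self, x):
            self._x = x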
3 | 5 | 0 | 4 | 18 | 1 | 1.2 | 0 | In Python properties are used instead of the Java-style getters, setters. So one rarely sees get... or set.. methods in the public interfaces of classes.
But in cases where a property is not appropriate one might still end up with methods that behave like getters or setters. Now my question: should these method names start with get_ / set_? Or is this unpythonic verbosity, since it is often obvious what is meant (and one can still use the docstring to clarify non-obvious situations)?
This might be a matter of personal taste, but I would be interested in what the majority thinks about this. What would you prefer as an API user?
Example: Say we have an object representing multiple cities. One might have a method get_city_by_postalcode(postalcode) or one could use the shorter name city_by_postalcode. I tend towards the latter. | 0 | python,coding-style | 2008-12-17T14:49:00.000 | 0 | 374,763 | I think shorter is better, so I tend to prefer the latter. But what's important is to be consistent with your project: don't mix the two methods. If you jump into someone else's project, keep what the other developers chose initially.
4 | 4 | 0 | 0 | 4 | 0 | 0 | 0 | We have a ftp system setup to monitor/download from remote ftp servers that are not under our control. The script connects to the remote ftp, and grabs the file names of files on the server, we then check to see if its something that has already been downloaded. If it hasn't been downloaded then we download the file and add it to the list.
We recently ran into an issue where someone on the remote ftp side will copy in a massive single file (>1GB); the script will then wake up, see a new file, and begin downloading the file that is still being copied in.
What is the best way to check this? I was thinking of grabbing the file size, waiting a few seconds, checking the file size again and seeing if it has increased; if it hasn't, then we download it. But since time is a concern, we can't wait a few seconds for every single file set to see if its file size has increased.
What would be the best way to go about this? Currently everything is done via Python's ftplib; how can we do this aside from using the aforementioned method?
Yet again let me reiterate this, we have 0 control over the remote ftp sites.
Thanks.
UPDATE1:
I was thinking: what if I tried to rename it? Since we have full permissions on the ftp, if the file upload is in progress, would the rename command fail?
We don't have any real options here... do we?
UPDATE2:
Well, here's something interesting: some of the ftps we tested on appear to automatically allocate the space once the transfer starts.
E.g., if I transfer a 200MB file to the ftp server and, while the transfer is active, connect to the ftp server and do a SIZE while the upload is happening, it shows 200MB for the size, even though the file is only about 10% complete.
Permissions also seem to be randomly set: the FTP server that comes with IIS sets the permissions AFTER the file is finished copying, while some of the other older ftp servers set them as soon as you send the file.
:'( | 0 | python,ftp,ftplib | 2008-12-17T18:54:00.000 | 1 | 375,620 | You can't know when the OS copy is done. It could slow down or wait.
For absolute certainty, you really need two files.
The massive file.
And a tiny trigger file.
They can mess with the massive file all they want. But when they touch the trigger file, you're downloading both.
If you can't get a trigger, you have to balance the time required to poll vs. the time required to download.
Do this.
Get a listing. Check timestamps.
Check sizes vs. previous size of file. If size isn't even close, it's being copied right now. Wait; loop on this step until size is close to previous size.
While you're not done:
a. Get the file.
b. Get a listing AGAIN. Check the size of the new listing, previous listing and your file. If they agree: you're done. If they don't agree: file changed while you were downloading; you're not done. | 0 | 1,585 | false | 0 | 1 | Prevent ftplib from Downloading a File in Progress? | 375,650 |
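A rough ftplib sketch of the size-polling part of those steps (the host, file name, and 2-second interval are placeholder assumptions, and note that the question's UPDATE2 observes some servers pre-allocate the full size, which defeats this check):

    import time
    from ftplib import FTP

    def wait_until_stable(ftp, name, delay=2, tries=5):
        """Return True once the reported size stops changing between polls."""
        last = ftp.size(name)
        for _ in range(tries):
            time.sleep(delay)
            now = ftp.size(name)
            if now == last:
                return True
            last = now
        return False

    # hypothetical usage:
    # ftp = FTP("ftp.example.com", "user", "password")
    # if wait_until_stable(ftp, "huge_file.bin"):
    #     ...download it...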
4 | 4 | 0 | 5 | 4 | 0 | 1.2 | 0 | We have a ftp system setup to monitor/download from remote ftp servers that are not under our control. The script connects to the remote ftp, and grabs the file names of files on the server, we then check to see if its something that has already been downloaded. If it hasn't been downloaded then we download the file and add it to the list.
We recently ran into an issue where someone on the remote ftp side will copy in a massive single file (>1GB); the script will then wake up, see a new file, and begin downloading the file that is still being copied in.
What is the best way to check this? I was thinking of grabbing the file size, waiting a few seconds, checking the file size again and seeing if it has increased; if it hasn't, then we download it. But since time is a concern, we can't wait a few seconds for every single file set to see if its file size has increased.
What would be the best way to go about this? Currently everything is done via Python's ftplib; how can we do this aside from using the aforementioned method?
Yet again let me reiterate this, we have 0 control over the remote ftp sites.
Thanks.
UPDATE1:
I was thinking: what if I tried to rename it? Since we have full permissions on the ftp, if the file upload is in progress, would the rename command fail?
We don't have any real options here... do we?
UPDATE2:
Well, here's something interesting: some of the ftps we tested on appear to automatically allocate the space once the transfer starts.
E.g., if I transfer a 200MB file to the ftp server and, while the transfer is active, connect to the ftp server and do a SIZE while the upload is happening, it shows 200MB for the size, even though the file is only about 10% complete.
Permissions also seem to be randomly set: the FTP server that comes with IIS sets the permissions AFTER the file is finished copying, while some of the other older ftp servers set them as soon as you send the file.
:'( | 0 | python,ftp,ftplib | 2008-12-17T18:54:00.000 | 1 | 375,620 | “Damn the torpedoes! Full speed ahead!”
Just download the file. If it is a large file then after the download completes wait as long as is reasonable for your scenario and continue the download from the point it stopped. Repeat until there is no more stuff to download. | 0 | 1,585 | true | 0 | 1 | Prevent ftplib from Downloading a File in Progress? | 375,800 |
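ftplib can restart a transfer at an offset via the rest argument of retrbinary, so a hedged sketch of that download-then-resume approach might look like this (the 30-second wait and the file names are placeholders):

    import os
    import time
    from ftplib import FTP

    def download_with_resume(ftp, remote_name, local_name, wait=30):
        """Keep appending to the local file from wherever the last attempt stopped."""
        while True:
            offset = os.path.getsize(local_name) if os.path.exists(local_name) else 0
            with open(local_name, "ab") as out:
                ftp.retrbinary("RETR " + remote_name, out.write, rest=offset)
            if os.path.getsize(local_name) >= ftp.size(remote_name):
                return
            time.sleep(wait)   # the remote file grew while downloading; go around again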
4 | 4 | 0 | 0 | 4 | 0 | 0 | 0 | We have a ftp system setup to monitor/download from remote ftp servers that are not under our control. The script connects to the remote ftp, and grabs the file names of files on the server, we then check to see if its something that has already been downloaded. If it hasn't been downloaded then we download the file and add it to the list.
We recently ran into an issue where someone on the remote ftp side will copy in a massive single file (>1GB); the script will then wake up, see a new file, and begin downloading the file that is still being copied in.
What is the best way to check this? I was thinking of grabbing the file size, waiting a few seconds, checking the file size again and seeing if it has increased; if it hasn't, then we download it. But since time is a concern, we can't wait a few seconds for every single file set to see if its file size has increased.
What would be the best way to go about this? Currently everything is done via Python's ftplib; how can we do this aside from using the aforementioned method?
Yet again let me reiterate this, we have 0 control over the remote ftp sites.
Thanks.
UPDATE1:
I was thinking: what if I tried to rename it? Since we have full permissions on the ftp, if the file upload is in progress, would the rename command fail?
We don't have any real options here... do we?
UPDATE2:
Well, here's something interesting: some of the ftps we tested on appear to automatically allocate the space once the transfer starts.
E.g., if I transfer a 200MB file to the ftp server and, while the transfer is active, connect to the ftp server and do a SIZE while the upload is happening, it shows 200MB for the size, even though the file is only about 10% complete.
Permissions also seem to be randomly set: the FTP server that comes with IIS sets the permissions AFTER the file is finished copying, while some of the other older ftp servers set them as soon as you send the file.
:'( | 0 | python,ftp,ftplib | 2008-12-17T18:54:00.000 | 1 | 375,620 | If you are dealing with multiple files, you could get the list of all the sizes at once, wait ten seconds, and see which are the same. Whichever are still the same should be safe to download. | 0 | 1,585 | false | 0 | 1 | Prevent ftplib from Downloading a File in Progress? | 375,716 |
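A sketch of that batch comparison with ftplib (the ten-second delay mirrors the answer; entries for which SIZE fails, such as directories, are simply skipped):

    import time
    from ftplib import FTP

    def _size(ftp, name):
        try:
            return ftp.size(name)   # may fail for directories or servers without SIZE
        except Exception:
            return None

    def stable_files(ftp, delay=10):
        """Return names whose reported size did not change across one poll."""
        before = dict((name, _size(ftp, name)) for name in ftp.nlst())
        time.sleep(delay)
        return [name for name in ftp.nlst()
                if before.get(name) is not None and before[name] == _size(ftp, name)]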
4 | 4 | 0 | 0 | 4 | 0 | 0 | 0 | We have a ftp system setup to monitor/download from remote ftp servers that are not under our control. The script connects to the remote ftp, and grabs the file names of files on the server, we then check to see if its something that has already been downloaded. If it hasn't been downloaded then we download the file and add it to the list.
We recently ran into an issue where someone on the remote ftp side will copy in a massive single file (>1GB); the script will then wake up, see a new file, and begin downloading the file that is still being copied in.
What is the best way to check this? I was thinking of grabbing the file size, waiting a few seconds, checking the file size again and seeing if it has increased; if it hasn't, then we download it. But since time is a concern, we can't wait a few seconds for every single file set to see if its file size has increased.
What would be the best way to go about this? Currently everything is done via Python's ftplib; how can we do this aside from using the aforementioned method?
Yet again let me reiterate this, we have 0 control over the remote ftp sites.
Thanks.
UPDATE1:
I was thinking: what if I tried to rename it? Since we have full permissions on the ftp, if the file upload is in progress, would the rename command fail?
We don't have any real options here... do we?
UPDATE2:
Well, here's something interesting: some of the ftps we tested on appear to automatically allocate the space once the transfer starts.
E.g., if I transfer a 200MB file to the ftp server and, while the transfer is active, connect to the ftp server and do a SIZE while the upload is happening, it shows 200MB for the size, even though the file is only about 10% complete.
Permissions also seem to be randomly set: the FTP server that comes with IIS sets the permissions AFTER the file is finished copying, while some of the other older ftp servers set them as soon as you send the file.
:'( | 0 | python,ftp,ftplib | 2008-12-17T18:54:00.000 | 1 | 375,620 | As you say you have 0 control over the servers and can't make your clients post trigger files as suggested by S. Lott, you must deal with the imperfect solution and risk incomplete file transmission, perhaps by waiting for a while and comparing file sizes before and after.
You can try to rename as you suggested, but as you have 0 control you can't be sure that the ftp-server administrator (or their successor) doesn't change platforms or ftp servers or restrict your permissions.
Sorry. | 0 | 1,585 | false | 0 | 1 | Prevent ftplib from Downloading a File in Progress? | 375,705 |
1 | 15 | 0 | 0 | 337 | 0 | 0 | 0 | In Python, is there a portable and simple way to test if an executable program exists?
By simple I mean something like the which command, which would be just perfect. I don't want to search PATH manually or something involving trying to execute it with Popen & al and see if it fails (that's what I'm doing now, but imagine it's launchmissiles) | 0 | python,path | 2008-12-18T05:55:00.000 | 1 | 377,017 | So basically you want to find a file in a mounted filesystem (not necessarily in PATH directories only) and check if it is executable. This translates to the following plan:
enumerate all files in locally mounted filesystems
match results with name pattern
for each file found check if it is executable
I'd say, doing this in a portable way will require lots of computing power and time. Is it really what you need? | 0 | 167,325 | false | 0 | 1 | Test if executable exists in Python? | 377,590 |
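A hedged sketch of the plan the answer describes, walking the filesystem from an assumed root of '/' (in practice, iterating only the directories in os.environ['PATH'] is far cheaper and is usually what the question actually needs):

    import os

    def find_executables(name, root="/"):
        """Walk the tree under `root` and yield executable files matching `name`."""
        for dirpath, dirnames, filenames in os.walk(root):
            if name in filenames:
                full = os.path.join(dirpath, name)
                if os.access(full, os.X_OK):
                    yield full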
1 | 1 | 0 | 3 | 1 | 1 | 1.2 | 0 | I have some old python code that uses the pywin32 extensions. Starting out with .net, I would like to port it to ironpython.
The old python code uses things like pythoncom.com_error, pywintypes.Time and interfaces a COM module that implements the IDispatch interface.
Does the .net libraries of ironpython have all I need for communicating with the COM module?
Specifically, does it have something to replace com_error and Time?
Thanks. | 0 | .net,com,ironpython | 2008-12-21T18:22:00.000 | 0 | 384,761 | Answering my own post.. :-)
com_error may be replaced by System.Runtime.InteropServices.COMException
The pywintypes.Time may be replaced by System.DateTime (DATE in the IDispatch interface)
Still, if anybody knows about any good documentation on IronPython, COM interoperability and moving from pywin32 to .net, please respond.. | 0 | 887 | true | 0 | 1 | Does ironpython have libraries that replace the pywin32 extensions? | 385,013 |
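A small IronPython sketch of the two replacements mentioned above; the ProgID is a placeholder, not a real component:

    import clr
    from System import Activator, DateTime, Type
    from System.Runtime.InteropServices import COMException

    com_type = Type.GetTypeFromProgID("Some.Application")    # placeholder ProgID
    try:
        app = Activator.CreateInstance(com_type)             # late-bound IDispatch object
    except COMException:                                     # stands in for pythoncom.com_error
        app = None

    timestamp = DateTime.Now                                 # stands in for pywintypes.Time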
2 | 5 | 0 | 1 | 10 | 1 | 0.039979 | 0 | I come from class-based object-oriented languages and recently I have been learning those fancy dynamic languages (JavaScript, Python and Lua), and I want some tips about how to use OO in those languages. It would be useful to know the pitfalls and the shortcomings of such an approach and the advantages compared to traditional OO.
The general notion that I got is that prototype based OO is basically programming with objects but no standard on how to use them whereas in normal OO there is a fixed predefined way to make and use objects.
In summary, what are the good, the bad and the ugly parts of such an approach? | 0 | javascript,python,language-agnostic,lua,oop | 2008-12-22T02:25:00.000 | 0 | 385,403 | Classical inheritance is inherently flawed in terms of flexibility, in that we are saying "this object is of this type and no other". Some languages introduce multiple inheritance to alleviate this, but multiple inheritance has its own pitfalls, and so the benefits of pure composition over inheritance (which, in a statically typed language, is a runtime rather than a compile time mechanism) become clear.
Taking the concept of composition to this "pure" level, we can eliminate classical inheritance altogether along with static typing. By composing objects at runtime and using them as blueprints (the prototypal approach), we need never concern ourselves with boxing objects too tightly through inheritance, nor burden ourselves with the issues inherent in multiple inheritance approaches.
So prototypal means much more flexible development of modules.
Of course, it's quite another thing to say it's EASY to develop without static typing. IMO, it is not. | 0 | 4,736 | false | 0 | 1 | Prototype based object orientation. The good, the bad and the ugly? | 3,958,261 |
2 | 5 | 0 | 0 | 10 | 1 | 0 | 0 | I come from class-based object-oriented languages and recently I have been learning those fancy dynamic languages (JavaScript, Python and Lua), and I want some tips about how to use OO in those languages. It would be useful to know the pitfalls and the shortcomings of such an approach and the advantages compared to traditional OO.
The general notion that I got is that prototype based OO is basically programming with objects but no standard on how to use them whereas in normal OO there is a fixed predefined way to make and use objects.
In summary, what are the good, the bad and the ugly parts of such an approach? | 0 | javascript,python,language-agnostic,lua,oop | 2008-12-22T02:25:00.000 | 0 | 385,403 | Okay, first of all, the prototype model isn't all that different in reality; Smalltalk uses a similar sort of scheme; the class is an object with the class's methods.
Looked at from the class POV, a class is really the equivalence class of objects with the same data, and all the same methods; you can look at adding a method to a prototype as creating a new subclass.
The implementation is simple, but makes it very difficult to do effective typechecking. | 0 | 4,736 | false | 0 | 1 | Prototype based object orientation. The good, the bad and the ugly? | 385,417 |
14 | 19 | 0 | 2 | 47 | 1 | 0.02105 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | I used to prototype lots of things in python for doing things like log processing. When they didn't run fast enough, I'd rewrite them in ocaml.
In many cases, the Python was fine and I was happy with it. In some cases, as it started approaching 23 hours to do a day's logs, I'd get to rewriting. :)
I would like to point out that even in those cases, I may have been better off just profiling the python code and finding a happier python implementation. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 386,689 |
14 | 19 | 0 | 19 | 47 | 1 | 1 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | This is a much more difficult question to answer than people are willing to admit.
For example, it may be that I am able to write a program that performs better in Python than it does in C. The fallacious conclusion from that statement is "Python is therefore faster than C". In reality, it may be because I have much more recent experience in Python and its best practices and standard libraries.
In fact no one can really answer your question unless they are certain that they can create an optimal solution in both languages, which is unlikely. In other words "My C solution was faster than my Python solution" is not the same as "C is faster than Python"
I'm willing to bet that Guido van Rossum could have written Python solutions for Adam's and Dustin's problems that performed quite well.
My rule of thumb is that unless you are writing the sort of application that requires you to count clock cycles, you can probably achieve acceptable performance in Python. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 386,702 |
14 | 19 | 0 | 1 | 47 | 1 | 0.010526 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | A month ago I had this little program written in Python (for work) that analyzes logs.
When the number of log files grew, the program began to get very slow, and I thought I could rewrite it in Java.
It was very interesting. It took a whole day to migrate the same algorithm from Python to Java. At the end of the day, a few benchmark trials showed me clearly that the Java program was some 20% / 25% slower than its Python counterpart. It was a surprise to me.
Writing the algorithm a second time also showed me that some optimization was possible. So in two hours I completely rewrote the whole thing in Python, and it was some 40% faster than the original Python implementation (and hence far faster than the Java version I had).
So:
Python is a slow language, but it can still be faster, for certain tasks, than other supposedly faster languages.
If you have to spend time writing something in a language whose execution is faster but whose development time is slower (most languages), consider using the same time to analyze the problem, search for libraries, profile and then write better Python code. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 393,185 |
14 | 19 | 0 | 2 | 47 | 1 | 0.02105 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | You can always write parts of your application in Python. Not every component is equally important for performance. Python integrates easily with C++ natively, or with Java via Jython, or with .NET via IronPython.
By the way, IronPython is more efficient than the C implementation of Python on some benchmarks. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 386,706 |
14 | 19 | 0 | 7 | 47 | 1 | 1 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | Not so far. I work for a company that has a molecular simulation engine and a bunch of programs written in python for processing the large multi-gigabyte datasets. All of our analysis software is now being written in Python because of the huge advantages in development flexibility and time.
If something is not fast enough we profile it with cProfile and find the bottlenecks. Usually there are one or two functions which take up 80 or 90% of the runtime. We then take those functions and rewrite them in C, something which Python makes dead easy with its C API. In many cases this results in an order of magnitude or more speedup. Problem gone. We then go on our merry way continuing to write everything else in Python. Rinse and repeat...
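A minimal sketch of that profiling step (the function and data names here are hypothetical stand-ins, not our real code):

    import cProfile
    import pstats

    def analyse_dataset(frames):
        # Stand-in for the real per-frame analysis; just a dummy numeric loop.
        total = 0.0
        for frame in frames:
            total += sum(x * x for x in frame)
        return total

    profiler = cProfile.Profile()
    profiler.enable()
    analyse_dataset([range(1000)] * 500)   # dummy data in place of the real dataset
    profiler.disable()

    # The one or two functions eating 80-90% of the runtime show up at the top.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)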
For entire modules or classes we tend to use Boost.python, it can be a bit of a bear but ultimately works well. If it's just a function or two, we sometimes inline it with scipy.weave if the project is already using scipy. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 386,999 |
14 | 19 | 0 | 7 | 47 | 1 | 1 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | While at uni we were writing a computer vision system for analysing human behaviour based on video clips. We used python because of the excellent PIL, to speed up development and let us get easy access to the image frames we'd extracted from the video for converting to arrays etc.
For 90% of what we wanted it was fine and since the images were reasonably low resolution the speed wasn't bad. However, a few of the processes required some complex pixel-by-pixel computations as well as convolutions which are notoriously slow. For these particular areas we re-wrote the innermost parts of the loops in C and just updated the old Python functions to call the C functions.
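To give a feel for it, here is a rough illustration (not the project's actual code) of the kind of pixel-by-pixel loop that is painfully slow in pure Python and was worth pushing down into C:

    def convolve_grey(image, kernel):
        # image: list of rows of ints; kernel: small 2D list, e.g. 3x3.
        kh, kw = len(kernel), len(kernel[0])
        h, w = len(image), len(image[0])
        out = [[0] * w for _ in range(h)]
        for y in range(kh // 2, h - kh // 2):
            for x in range(kw // 2, w - kw // 2):
                acc = 0
                for ky in range(kh):
                    for kx in range(kw):
                        acc += kernel[ky][kx] * image[y + ky - kh // 2][x + kx - kw // 2]
                out[y][x] = acc
        return out

    blur = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
    tiny = [[y * 10 + x for x in range(5)] for y in range(5)]
    print(convolve_grey(tiny, blur)[2][2])   # 198: centre pixel of the blurred image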
This gave us the best of both worlds. We had the ease of data access that Python provides, which enabled us to develop quickly, and then the straight-line speed of C for the most intensive computations. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 386,770
14 | 19 | 0 | 2 | 47 | 1 | 0.02105 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | I've been working for a while now, developing an application that operates on large structured data stored in a multi-gigabyte database, and, well, Python is good enough for that. The application has a GUI client with a multitude of controls (lists, trees, notebooks, virtual lists and more), and a console server.
We had some performance issues, but those were related more to poor algorithm design or database engine limitations (we use Oracle, MS-SQL and MySQL, and had a short romance with BerkeleyDB for speed optimizations) than to Python itself. Once you know how to use the standard libraries (written in C) properly, you can make your code really quick.
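A tiny illustration of what I mean (a generic example, not taken from our application): replacing a Python-level string-building loop with the C-implemented str.join:

    import timeit

    parts = [str(i) for i in range(10000)]

    def concat_loop():
        s = ""
        for p in parts:
            s += "," + p
        return s

    def join_builtin():
        return ",".join(parts)

    print(timeit.timeit(concat_loop, number=100))
    print(timeit.timeit(join_builtin, number=100))   # usually much faster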
As others say - any computation intensive algorithm, code that depends on bit-stuffing, some memory constrained computation - can be done in raw C/C++ for CPU/memory saving (or any other tricks), but the whole user interaction, logging, database handling, error handling - all that makes the application actually run, can be written in Python and it will maintain responsiveness and decent overall performance. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 386,752 |
14 | 19 | 0 | 0 | 47 | 1 | 0 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | I generally don't rewrite in C before I:
profile
rewrite with better algorithms (generally this is enough)
rewrite python code with low level performance in mind (but never to the point of having non pythonic / non readable code)
spend some time rechecking that a library cannot already do this (first in the stdlib, then in an external lib)
try psyco / other implementations (this rarely achieves a REAL speed boost in my case)
Then sometimes I created a shared library for heavy matrix computation code (which couldn't be done with numarray) and called it with ctypes (a rough sketch follows this list):
simple to write/build/test a .so / dll in pure C,
simple to expose the C function to Python (i.e. you don't need wrapper code if you use basic datatypes, since ctypes does all the work of passing the right arguments for you), and it is certainly fast enough then. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 1,900,043
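The ctypes pattern mentioned above looks roughly like this (the library name and the exported function are hypothetical; the .so has to be built separately from a small C file):

    import ctypes

    # Assumes libmatrix.so exports: double dot(const double *a, const double *b, int n);
    lib = ctypes.CDLL("./libmatrix.so")
    lib.dot.restype = ctypes.c_double
    lib.dot.argtypes = (ctypes.POINTER(ctypes.c_double),
                        ctypes.POINTER(ctypes.c_double),
                        ctypes.c_int)

    def dot(a, b):
        n = len(a)
        Array = ctypes.c_double * n        # C array type of the right length
        return lib.dot(Array(*a), Array(*b), n)

    print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))   # 32.0, given the C library above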
14 | 19 | 0 | 2 | 47 | 1 | 0.02105 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | Yes, twice:
An audio DSP application I wound up completely rewriting in C++ because I couldn't get appropriate performance in Python; I don't consider the Python implementation wasted because it let me prototype the concept very easily, and the C++ port went smoothly because I had a working reference implementation.
A procedural graphic rendering project, where generating large 2D texture maps in Python was taking a long time; I wrote a C++ DLL and used ctypes/windll to use it from Python. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 3,122,149 |
14 | 19 | 0 | 1 | 47 | 1 | 0.010526 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | No, I've never had to rewrite. In fact, I started using Python in Maya 8.5. Before Maya 8, the only scripting language available was the built in MEL (Maya Expression Language). Python is actually faster than the built in language that it wraps.
Python's ability to work with complex data types also made it faster, because MEL can only store single-dimensional arrays (and has no pointers). Multi-dimensional arrays would have to be faked either by using multiple parallel arrays or by using slow string concatenation. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 387,988
14 | 19 | 0 | 2 | 47 | 1 | 0.02105 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | I have been developing in Python for several years now. Recently I had to list all files in a directory and build a struct with filename, size, attributes and modification date. I did this with os.listdir and os.stat. The code was quite fast, but the more entries the directories had, the slower my code became compared to other file managers listing the same directory, so I rewrote the code using SWIG/C++ and was really surprised how much faster the code was. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 501,942
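A rough sketch of that os.listdir/os.stat approach (not the original code); one stat() call per entry is exactly what gets slow on directories with many files:

    import os
    import stat

    def list_directory(path):
        entries = []
        for name in os.listdir(path):
            st = os.stat(os.path.join(path, name))
            entries.append({
                "name": name,
                "size": st.st_size,
                "is_dir": stat.S_ISDIR(st.st_mode),
                "mtime": st.st_mtime,
            })
        return entries

    print(len(list_directory(".")))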
14 | 19 | 0 | 1 | 47 | 1 | 0.010526 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | I once had to write a pseudo-random number generator for a simulator. I wrote it in Python first, but Python proved to be way too slow; I ended up rewriting it in C, and even that was slow, but not nearly as slow as Python.
Luckily, it's fairly easy to bridge Python and C, so I was able to write the PRNG as a C module and still write the rest of the simulator in Python. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 393,200 |
14 | 19 | 0 | 4 | 47 | 1 | 0.04208 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | This kind of question is likely to start a religious war among language people so let me answer it a little bit differently.
For most cases in today's computing environments, your choice of programming language should be based on what you can program efficiently, program well, and what makes you happy, not on the performance characteristics of the language. Also, optimization should generally be the last concern when programming any system.
The typical Python way to do things is to start out writing your program in Python; then, if you notice the performance suffering, profile the application and attempt to optimize the hot spots in Python first. If optimizing the Python code still isn't good enough, the areas of the code that are weighing you down should be rewritten as a Python module in C. If even after all of that your program isn't fast enough, you can either change languages or look at scaling up in hardware or concurrency.
That's the long answer. To answer your question directly: no, Python (sometimes with C extensions) has been fast enough for everything I need it to do. The only time I really dip into C is to get access to stuff that doesn't have Python bindings.
Edit: My background is a python programmer at a large .com where we use python for everything from the front-end of our websites all the way down to all the back-office systems. Python is very much an enterprise-grade language. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 390,700 |
14 | 19 | 0 | 5 | 47 | 1 | 0.052583 | 0 | Has anyone ever had code in Python, that turned out not to perform fast enough?
I mean, you were forced to choose another language because of it?
We are investigating using Python for a couple of larger projects, and my feeling is that in most cases, Python is plenty fast enough for most scenarios (compared to say, Java) because it relies on optimized C routines.
I wanted to see if people had instances where they started out in Python, but ended up having to go with something else because of performance.
Thanks. | 0 | python,performance,optimization,rewrite | 2008-12-22T16:23:00.000 | 0 | 386,655 | Whenever I find a Python bottleneck, I rewrite that code in C as a Python module.
For example, I have some hardware that sends image pixels as 4-byte 0RGB. Converting 8M from 0RGB to RGB in Python takes too long, so I rewrote it in C as a Python module.
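For a feel of why that is slow, here is a rough sketch (not the author's code) of the pure-Python conversion: a byte-level loop over every 4-byte pixel, dropping the leading 0 byte:

    def zrgb_to_rgb(data):
        out = bytearray()
        for i in range(0, len(data), 4):
            out += data[i + 1:i + 4]   # skip the 0 byte, keep R, G, B
        return bytes(out)

    frame = bytes([0, 10, 20, 30] * 4)              # a tiny dummy frame
    assert zrgb_to_rgb(frame) == bytes([10, 20, 30] * 4)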
Writing Python (or other higher level languages) is much faster than writing in C so I use Python until I can't. | 0 | 8,929 | false | 0 | 1 | Python Performance - have you ever had to rewrite in something else? | 386,909 |
1 | 6 | 0 | 1 | 41 | 0 | 0.033321 | 0 | So far I've found it impossible to produce usable tracebacks when Mako templates aren't coded correctly.
Is there any way to debug templates besides iterating for every line of code? | 0 | python,debugging,templates,jinja2,mako | 2008-12-23T23:43:00.000 | 0 | 390,409 | I break them down into pieces, and then reassemble the pieces when I've found the problem.
Not good, but it's really hard to tell what went wrong in a big, complex template. | 0 | 10,962 | false | 1 | 1 | How do you debug Mako templates? | 390,603 |
4 | 21 | 0 | 20 | 148 | 1 | 1 | 0 | I have a friend who likes to use metaclasses, and regularly offers them as a solution.
I am of the mind that you almost never need to use metaclasses. Why? Because I figure that if you are doing something like that to a class, you should probably be doing it to an object, and a small redesign/refactor is in order.
Being able to use metaclasses has caused a lot of people in a lot of places to use classes as some kind of second rate object, which just seems disastrous to me. Is programming to be replaced by meta-programming? The addition of class decorators has unfortunately made it even more acceptable.
So please, I am desperate to know your valid (concrete) use-cases for metaclasses in Python. Or to be enlightened as to why mutating classes is better than mutating objects, sometimes.
I will start:
Sometimes when using a third-party library it is useful to be able to mutate the class in a certain way.
(This is the only case I can think of, and it's not concrete) | 0 | python,metaclass | 2008-12-24T20:13:00.000 | 0 | 392,160 | Let's start with Tim Peters' classic quote:
Metaclasses are deeper magic than 99% of users should ever worry about. If you wonder whether you need them, you don't (the people who actually need them know with certainty that they need them, and don't need an explanation about why).
Tim Peters (c.l.p post 2002-12-22)
Having said that, I have (periodically) run across true uses of metaclasses. The one that comes to mind is in Django where all of your models inherit from models.Model. models.Model, in turn, does some serious magic to wrap your DB models with Django's ORM goodness. That magic happens by way of metaclasses. It creates all manner of exception classes, manager classes, etc. etc.
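A toy sketch (my own illustration, not Django's actual code) of the kind of work such a metaclass does: at class-definition time it collects the declared Field attributes into a _meta structure, so the bookkeeping happens once per model class:

    class Field:
        def __init__(self, column_type):
            self.column_type = column_type

    class ModelMeta(type):
        def __new__(mcls, name, bases, namespace):
            fields = {k: v for k, v in namespace.items() if isinstance(v, Field)}
            cls = super().__new__(mcls, name, bases, namespace)
            cls._meta = {"table": name.lower(), "fields": fields}
            return cls

    class Model(metaclass=ModelMeta):       # modern Python 3 spelling
        pass

    class Book(Model):
        title = Field("varchar")
        pages = Field("integer")

    print(Book._meta["table"], sorted(Book._meta["fields"]))   # book ['pages', 'title']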
See django/db/models/base.py, class ModelBase() for the beginning of the story. | 0 | 32,950 | false | 0 | 1 | What are some (concrete) use-cases for metaclasses? | 392,442 |
4 | 21 | 0 | 8 | 148 | 1 | 1 | 0 | I have a friend who likes to use metaclasses, and regularly offers them as a solution.
I am of the mind that you almost never need to use metaclasses. Why? Because I figure that if you are doing something like that to a class, you should probably be doing it to an object, and a small redesign/refactor is in order.
Being able to use metaclasses has caused a lot of people in a lot of places to use classes as some kind of second rate object, which just seems disastrous to me. Is programming to be replaced by meta-programming? The addition of class decorators has unfortunately made it even more acceptable.
So please, I am desperate to know your valid (concrete) use-cases for metaclasses in Python. Or to be enlightened as to why mutating classes is better than mutating objects, sometimes.
I will start:
Sometimes when using a third-party library it is useful to be able to mutate the class in a certain way.
(This is the only case I can think of, and it's not concrete) | 0 | python,metaclass | 2008-12-24T20:13:00.000 | 0 | 392,160 | A reasonable pattern of metaclass use is doing something once when a class is defined rather than repeatedly whenever the same class is instantiated.
When multiple classes share the same special behaviour, repeating __metaclass__=X is obviously better than repeating the special purpose code and/or introducing ad-hoc shared superclasses.
But even with only one special class and no foreseeable extension, __new__ and __init__ of a metaclass are a cleaner way to initialize class variables or other global data than intermixing special-purpose code and normal def and class statements in the class definition body. | 0 | 32,950 | false | 0 | 1 | What are some (concrete) use-cases for metaclasses? | 7,057,480 |
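As a small, hypothetical illustration of that "once per class definition" pattern: a metaclass whose __init__ registers every subclass in a plugin registry, work that would otherwise have to be repeated in each class body or at each instantiation:

    registry = {}

    class PluginMeta(type):
        def __init__(cls, name, bases, namespace):
            super().__init__(name, bases, namespace)
            if bases:                      # skip the abstract base class itself
                registry[name] = cls

    class Plugin(metaclass=PluginMeta):
        pass

    class CsvExporter(Plugin):
        pass

    class JsonExporter(Plugin):
        pass

    print(sorted(registry))   # ['CsvExporter', 'JsonExporter']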