The language needs to fit in our brains and you could argue that we've already pushed past that threshold
I would argue that "prefix just calls the registered function at import time" fits in my brain better than "20 special cases in the lexer which result in magic things happening", but maybe that's just me.
One concern I'd have is in how you would look up what some arbitrary prefix is supposed to do
In theory PyCharm/Vim/Emacs would be able to just jump to the definition of the prefix, instantly giving you the docstring along with the source code of what it does, so I don't think this is a real problem. Tool support will catch up, as always, and you could always grep "@str_prefix\ndef blah" or something similar.
Furthermore, it sounds like the intent is to apply the prefix at run-time.
Is there any other time in Python? There's only import-time and execution-time to choose from.
That means that the function that gets applied depends on what's in some registry at a given moment. So source with these custom string prefixes will be even more ambiguous when read.
I don't think the second line follows from the first. You could say the same of PEP 302 import hooks, which overload a fundamental operation (import xxx) to do whatever the hell you want, depending on what's in your sys.meta_path at any given moment. Heck, you could say the same of any function call my_func(...): it depends on what my_func has been defined as at a given moment! Remember that functions can be (and often are!) rebound multiple times, but people seem to get by just fine.

In that case, why not just call the function at the module scope, bind the result to a name there, and use that name wherever you like. It's effectively a constant, right?

The main reason I'd give is that moving a whole bunch of things into module scope spaghettifies your program. If I'm making a bunch of SQL queries in a bunch of functions, I expect the SQL to be in the functions where I'm using them (nobody else cares about them, right?) rather than at module scope. Moving stuff to module scope for performance adds another layer of unnecessary indirection: when I look at my_function and wonder what my_function_sql_query and my_function_xml_template are meant to do, for every single function, this quickly degenerates into a "huge mess of things in global scope which I don't know do what" problem. Unless you're super disciplined with your naming conventions so that you can see, instantly, what function uses what global. But that's the whole point of having function-level scope in the first place!

Exactly how annoying this is depends on how much "code in strings" you want to use. It may sound crazy, but I think Dropbox's Pyxl, Facebook's XHP, and React show that it's pretty nice having external code (in this case HTML templates) localized to exactly the point of use, right in the source code, rather than being forced to chuck it off into a whole separate top-level declaration or (as it's usually done now) a separate file.
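The two styles being contrasted look something like this (the names `get_user_*` and the query are made up for illustration):

```python
# Style 1: the query hoisted to module scope "for performance".
# A reader of get_user_hoisted has to jump elsewhere to see what
# it actually runs.
GET_USER_SQL = "SELECT id, name FROM users WHERE id = ?"

def get_user_hoisted(user_id):
    return ("execute", GET_USER_SQL, user_id)

# Style 2: the query localized to its single point of use, at the
# cost of re-creating the string object on each call.
def get_user_inline(user_id):
    sql = "SELECT id, name FROM users WHERE id = ?"
    return ("execute", sql, user_id)
```

Multiply style 1 by dozens of functions, each dragging a `*_SQL` and `*_TEMPLATE` constant into module scope, and the global-namespace clutter the paragraph above describes follows naturally.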
I'd argue that at least part of the reason external code templates tend to be large and clunky, rather than small and focused like Pyxl/XHP/React snippets tend to be, is exactly this problem: too many small, focused snippets means too many top-level files or declarations, which results in too much indirection and spaghetti, so naturally people make their templates large to minimize the number of things lying around in global scope.
But still, it's amazing how C++11-ish this discussion is getting. Which may be a good hint that (as you suggest) this feature isn't a good fit for Python.
I don't agree with this; while C++ is huge and terrible, C++11 actually has
some pretty good stuff (e.g. real lambdas, with real closures and real
lexical scoping! u"unicode" and r"raw" strings!). Dismissing something as
bad just because it's something C++11 has is a terrible idea and
immediately shuts out a whole range of interesting possibilities.
-Haoyi
On Thu, May 30, 2013 at 1:41 AM, Andrew Barnert wrote:
On May 29, 2013, at 22:21, Eric Snow wrote:

Now, give a twist to this idea of some stable `literal -> constant` operation. Consider a new syntax or built-in function for indicating that, at compile-time, an expression should be considered equivalent to the literal to which it evaluates. Then the compiler would be free to substitute the literal for the expression and store the literal in the bytecode/constants/pyc file. From then on, that expression would not be evaluated at run-time anymore. Of course, the expression would have to be entirely literal-based and evaluate to a literal.
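For what it's worth, CPython's compiler already does a narrow version of this for purely literal expressions, which you can observe through a code object's constants:

```python
# CPython folds literal-only arithmetic at compile time: the computed
# value lands in co_consts and no multiplication happens at run-time.
code = compile("60 * 60 * 24", "<example>", "eval")
print(86400 in code.co_consts)  # True: folded to a constant

# An expression that is constant "in spirit" but not literal-only,
# such as a function call, is not folded and runs every time.
code2 = compile("int('86400')", "<example>", "eval")
print(86400 in code2.co_consts)  # False: only the string '86400' is stored
```

The proposal here would extend that folding to expressions the programmer explicitly marks as constant-evaluable, rather than only those the compiler can prove literal.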
So you're suggesting that instead of C++11-style string prefixes, we should have C++11-style constexpr.
I realize your ultimate conclusion was that we probably don't need _either_ feature. But still, it's amazing how C++11-ish this discussion is getting. Which may be a good hint that (as you suggest) this feature isn't a good fit for Python.
Unless someone has a way of doing it through compile-time templates, of course. :)
The catch is that the compiler would have to evaluate the expression, which would probably add disproportionate complexity to the compiler.
I don't think it's that bad; the compiler just has to make a call to a PyEval* function.
Also, given that your proposal is explicitly an optional optimization that the compiler is free to ignore, there's an even more trivial implementation...

_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
http://mail.python.org/mailman/listinfo/python-ideas