Good day,
I got your email from the Installing Python Modules page in the Python 3.6.0 documentation.
I encountered an error when trying to install a package in the Python IDLE shell:
>>> python -m pip install numpy
SyntaxError: invalid syntax
I am using Windows 10.
I have not been able to find any solution to this. Could you please help? Alternatively, please direct me to a web group for help.
Many thanks.
Gan william
Hello,
I am a beginner in Python, and I am facing problems installing Python 3.5 on my Windows Vista x32 machine. I downloaded python-3.5.2.exe from Python.org. It is downloaded as an .exe file. When I try to install it via "Run as administrator", nothing happens. The same happens with the 3.6 version.
Kindly advise.
Regards & Thanks, Chitra Dewan
On Feb 14, 2017 12:21, "Vinay Sajip" <vinay_sajip(a)yahoo.co.uk> wrote:
> I thought the current status was that it's called metadata.json
> exactly *because* it's not standardized, and you *shouldn't* look at
> it?
Well, it was work-in-progress-standardised according to PEP 426 (since
sometimes implementations have to work in parallel with working out the
details of specifications). Given that PEP 426 wasn't done and dusted
but being progressed, I would have thought it perfectly acceptable to
use "pydist.json", as the only things that would be affected would be
packaging tools working to the PEP.
> It's too bad that the JSON thing didn't work out, but I think we're
> better off working on better specifying the one source of truth
> everything already uses (METADATA) instead of bringing in *new*
> partially-incompatible-and-poorly-specified formats.
When you say "everything already uses", do you mean setuptools and wheel?
If nobody else is allowed to play, that's one thing. But otherwise, there
need to be standards for interoperability. The METADATA file, now - exactly
which standard does it follow? The one in the dateutil wheel that Jim
referred to doesn't appear to conform to any of the metadata PEPs. It was
rejected by old metadata code in distlib (which came out of the Python
3.3-era "packaging" package - not to be confused with Donald's package of
the same name - and which is strict in its interpretation of those earlier
PEPs).
That's why I said we need to fix the standards to bring them back in sync
with reality. I'm not arguing that there's no problem, I'm saying that
replacing one serialization format with another won't actually address the
problem, but does cause new complications.
The METADATA format (key-value) is not really flexible enough for certain
things which were in PEP 426 (e.g. dependency descriptions), and for these
JSON seems a reasonable fit. There's no technical reason why "the JSON thing
didn't work out", as far as I can see - it was just given up on for a more
incremental approach (which has got no new PEPs other than 440, AFAICT). I
understand that social reasons are often more important than technical ones
when it comes to the success or failure of an approach; I'm just not sure
that in this case it wasn't given up on too early.
The technical problem with PEP 426 is that unless you want to throw away
pypi and start over, all tools need to understand the old METADATA files
regardless. So it still needs to be specified, all the same code needs to
be kept around, etc. Plus the most pressing issues are like "what does the
field actually mean", which is totally independent of the serialization
format.
If there are particular fields that need more structured data, then there
are options: we could have fields in METADATA whose values are JSON, or a
sidecar file that supplements the main METADATA file with extra
information. But adding a new way to specify fields like Name and Version
really doesn't help anybody.
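For what it's worth, the "fields whose values are JSON" option needs no new parsing machinery: METADATA uses the same key-value syntax as email headers, so a structured field could ride along as a JSON string. A minimal sketch - the "Run-Requires-JSON" field name is invented here purely for illustration, it is not part of any PEP:

```python
import json
from email.parser import Parser

# A METADATA-style document with one JSON-valued field.
# "Run-Requires-JSON" is a made-up field name, used only to illustrate
# embedding structured data in the existing key-value format.
metadata_text = """\
Metadata-Version: 2.0
Name: example-dist
Version: 1.0
Run-Requires-JSON: [{"requires": ["pyzmq"], "environment": "extra == 'zmq'"}]
"""

# METADATA shares its syntax with email headers, so the stdlib parser works.
msg = Parser().parsestr(metadata_text)
deps = json.loads(msg["Run-Requires-JSON"])
print(msg["Name"], deps[0]["requires"])
```

A sidecar file would look the same from the consumer's side: plain key-value lookups for the simple fields, json.loads for the structured ones.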
-n
Hi Good Afternoon,
This is Venkat from HCL Technologies. I have created an executable file (test.exe) using the py2exe package with Python 2.7 on Windows.
After that I ran my application from the path C:\Python27\dist\test.exe, and it executed and worked properly.
But the problem is that when I copy test.exe to another folder (other than "C:\Python27\dist\") and try to run it, it does not execute.
Could you please help me resolve this issue?
Thanks,
Venkat.
I've just released zc.buildout 2.8.0 and the buildout.wheel extension.
If you have zc.buildout 2.8.0 or later, and you include:
extensions = buildout.wheel
in the [buildout] section of your buildout configuration, then buildout
should be able to install distributions as wheels.
This allowed me to install numpy using buildout, which wasn't possible
before.
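For anyone who hasn't used a buildout extension before, a minimal buildout.cfg enabling this might look like the following - the numpy part is only an example, and nothing here beyond the extensions line is specific to buildout.wheel:

```ini
[buildout]
extensions = buildout.wheel
parts = deps

[deps]
recipe = zc.recipe.egg
eggs = numpy
```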
This is a somewhat experimental version, which uses humpty to convert wheels
to eggs. humpty in turn uses distlib, which seems to mishandle wheel
metadata. (For example, it chokes if there's extra distribution metadata,
which makes it impossible for buildout to install python-dateutil from a
wheel.)
Jim
--
Jim Fulton
http://jimfulton.info
Thanks for cc-ing me Steve.
I may be able to help jump-start this a bit and provide a platform for this to run on. I deployed a small service that scans PyPI to figure out statistics on Python 2 vs Python 3 support using PyPI classifiers. The source is on GitHub: https://github.com/crwilcox/PyPI-Gatherer. It watches the PyPI updates feed and refreshes entries for packages as they show up as modified. It should be possible to add your lib, query, and add an additional row or two to the result. I am happy to work together on this. Also, the data is stored in Azure Table Storage, which has REST endpoints (and a Python SDK) that make getting the published data straightforward.
Here is an example of using the data provided by the service. This is a Jupyter Notebook analysing Python 3 Adoption: https://notebooks.azure.com/chris/libraries/pypidataanalysis
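To make the "watches the updates feed" part concrete, here is a rough sketch of the feed-parsing step. The RSS fragment below is made up, shaped like PyPI's updates feed so the example runs offline; the real service does this continuously and then re-fetches each package's metadata:

```python
import xml.etree.ElementTree as ET

# A made-up fragment in the shape of PyPI's RSS updates feed,
# embedded here so the example runs without network access.
sample_feed = """\
<rss version="2.0">
  <channel>
    <item><title>pyzmq 16.0.2</title></item>
    <item><title>numpy 1.12.0</title></item>
  </channel>
</rss>
"""

def updated_packages(feed_xml):
    """Yield (name, version) pairs from an RSS updates feed."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        # Titles have the form "name version"; split on the last space.
        name, _, version = item.findtext("title").rpartition(" ")
        yield name, version

print(list(updated_packages(sample_feed)))
```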
Thanks.
Chris
From: Steve Dower [mailto:steve.dower@python.org]
Sent: Tuesday, 7 February, 2017 6:39
To: Thomas Kluyver <thomas(a)kluyver.me.uk>; distutils-sig(a)python.org
Cc: Chris Wilcox <Christopher.Wilcox(a)microsoft.com>
Subject: RE: [Distutils] Indexing modules in Python distributions
I'm interested, and potentially in a position to provide funded infrastructure for this (though perhaps not as soon as you'd like, since things can move slowly at my end).
My personal preference would be to download a full list. This is slow moving data that will gzip nicely, and my uses (in IDE) will require many tentative queries. I can also see value in a single-query API, but keep it simple - the value here is in the data, not the lookup.
As far as updates go, most packaging systems should have some sort of release notification or update feed, so the work is likely going to be in hooking up to those and turning it into a scan task.
Cheers,
Steve
Top-posted from my Windows Phone
________________________________
From: Thomas Kluyver<mailto:thomas@kluyver.me.uk>
Sent: 2/7/2017 3:30
To: distutils-sig(a)python.org<mailto:distutils-sig@python.org>
Subject: [Distutils] Indexing modules in Python distributions
For a variety of reasons, I would like to build an index of what
modules/packages are contained in which distributions ('packages') on
PyPI. For instance:
- Identifying requirements by static analysis of code: 'import zmq' ->
requires pyzmq
- Finding corresponding packages from different packaging systems: pyzmq
on PyPI corresponds to pyzmq in conda, and python[3]-zmq in Debian
repositories. This is an oversimplification, but importable module names
provide a common basis to compare packages. I'd like a tool that could
pick between different ways of installing a given module.
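To illustrate the kind of lookup such an index would enable - a sketch only, with a single hand-written entry taken from the pyzmq example above; a real index would be generated, and the Debian naming here is simplified:

```python
# Illustrative index: import name -> providing package per packaging system.
# The zmq/pyzmq entry comes from the example in the text; everything else
# in a real index would be generated, not hand-written.
MODULE_INDEX = {
    "zmq": {"pypi": "pyzmq", "conda": "pyzmq", "debian": "python3-zmq"},
}

def distribution_for(import_name, system="pypi"):
    """Map an imported module name to the package that provides it."""
    entry = MODULE_INDEX.get(import_name)
    return entry.get(system) if entry else None

print(distribution_for("zmq"))            # pyzmq
print(distribution_for("zmq", "debian"))  # python3-zmq
```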
People often assume that the import name is the same as the name on
PyPI. This is true in the vast majority of cases, but there's no
requirement that they are the same, and there are cases where they're
not - pyzmq is one example.
The metadata field 'Provides' is, according to PEP 314, intended for
this purpose, but the standard packaging tools don't make it easy to
use, and consequently very few packages specify it.
I have started putting together a tool to index wheels. It reads a .whl
file, finds modules inside it, and tries to identify namespace packages.
It's still quite rough, but it worked with the wheels I tried.
https://github.com/takluyver/wheeldex
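The starting point is simple, since a wheel is just a zip archive: the importable top-level names can be read straight from its file listing. A rough sketch of that step (the wheel here is built in memory for illustration, and this skips the namespace-package detection that wheeldex attempts):

```python
import io
import zipfile

def top_level_names(wheel_file):
    """Return the importable top-level module/package names in a wheel."""
    names = set()
    with zipfile.ZipFile(wheel_file) as zf:
        for path in zf.namelist():
            first = path.split("/")[0]
            if first.endswith(".dist-info") or first.endswith(".data"):
                continue  # metadata directories, not importable code
            if "/" in path:
                names.add(first)       # a package directory
            elif path.endswith(".py"):
                names.add(first[:-3])  # a top-level module
    return sorted(names)

# Build a tiny fake wheel in memory to demonstrate.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("zmq/__init__.py", "")
    zf.writestr("toplevel.py", "")
    zf.writestr("pyzmq-16.0.2.dist-info/METADATA", "")
print(top_level_names(buf))  # ['toplevel', 'zmq']
```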
Is this something that other people are interested in?
One thing I'm trying to work out at the moment is how the data would be
accessed: as a web service that tools can query online, or more like
Linux packaging, where tools download and cache a list to do lookups
locally. Or both? There's also, of course, the question of how the index
would be built and updated.
Thanks,
Thomas
_______________________________________________
Distutils-SIG maillist - Distutils-SIG(a)python.org<mailto:Distutils-SIG@python.org>
https://mail.python.org/mailman/listinfo/distutils-sig
Hello everyone,
I'm not sure this is the right place to write to propose new trove classifiers
for PyPI -- if it's not, what would be the right place? If this is it, then
please read below.
The MicroPython project is quickly growing and becoming more mature, and as
that happens, the number of 3rd-party libraries for it grows. Many of those
libraries get uploaded to PyPI, as you can check by searching for
"micropython". MicroPython even has its own version of "pip", called "upip",
that can be used to install those libraries.
However, there is as of yet no way to mark that a library is written for that
particular flavor of Python, as there are no trove classifiers for it. I would
like to propose adding a number of classifiers to amend that situation:
For MicroPython itself:
Programming Language :: Python :: Implementation :: MicroPython
For the hardware it runs on:
Operating System :: Baremetal
Environment :: Microcontroller
Environment :: Microcontroller :: PyBoard
Environment :: Microcontroller :: ESP8266
Environment :: Microcontroller :: Micro:bit
Environment :: Microcontroller :: WiPy
Environment :: Microcontroller :: LoPy
Environment :: Microcontroller :: OpenMV
I'm not sure if the latter makes sense, but it would certainly be nice to be
able to indicate in a machine-parseable way on which platforms the code works.
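As one concrete use: once such classifiers existed, a tool (upip, or a package search) could filter packages by platform with nothing more than string matching on the classifier lists. A sketch, with entirely made-up package data and the classifier strings as proposed above:

```python
# Made-up package -> classifiers data; the classifier strings are the
# *proposed* ones from this message, not yet accepted by PyPI.
PACKAGES = {
    "micropython-example-driver": [
        "Programming Language :: Python :: Implementation :: MicroPython",
        "Environment :: Microcontroller :: ESP8266",
    ],
    "requests": [
        "Programming Language :: Python :: Implementation :: CPython",
    ],
}

def for_platform(packages, classifier):
    """Return the package names carrying the given classifier."""
    return sorted(name for name, cs in packages.items() if classifier in cs)

print(for_platform(PACKAGES, "Environment :: Microcontroller :: ESP8266"))
```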
What do you think?
--
Radomir Dopieralski
Hi,
I was invited to give a talk at PyCon Colombia 2017, and I did it on
packaging. I thought people here would be interested to know about it.
I insisted on the need for packaging to get software into as many hands as
possible, gave a history of the packaging ecosystem, advised people to use
packaging.python.org suggestions, and mentioned the manylinux effort. I
tried to be as objective as possible there and mention the key people
involved.
I also talked a bit about what can still be improved, and focused on three
aspects, none of which is new or particularly insightful for people here:
infrastructure for automatic wheel building, better decoupling of packaging
and build, and maybe more controversially, the need for tools to remove
python from the equation.
https://speakerdeck.com/cournape/python-packaging-in-2017
David
Hello Everyone!
Ralf Gommers suggested that I put this proposal here on this list, for
feedback and for seeing if anyone would be willing to mentor me. So, here
it is.
-----
My name is Pradyun Gedam. I'm currently a first-year student at VIT
University in India.
I would like to apply for GSoC 2017 under PSF.
I currently have a project in mind - the "pip needs a dependency resolver"
issue [1]. I would like to take on this specific project but am willing to
do some other project as well.
For some background, around mid 2016, I started contributing to pip. The
first issue I tackled was #59 [2] - a request for upgrade command and an
upgrade-all command that has been open for over 5.5 years. Over the months
following that, I've have had the opportunity to work with and understand
multiple parts of pip's codebase while working on this issue and a few
others. This search on GitHub issues [3] also provides a good summary of
what work I've done on pip.
[1]: https://github.com/pypa/pip/issues/988
[2]: https://github.com/pypa/pip/issues/59
[3]: https://github.com/pypa/pip/issues?q=author%3Apradyunsg
Eagerly-waiting-for-a-response-ly,
Pradyun Gedam