I have recently downloaded Python 3.6 and I am very new to
Python. I tried to open Excel files in Python using openpyxl, but it
shows "ModuleNotFoundError". So I tried to install it using pip install
openpyxl, and it showed "SyntaxError: invalid syntax". Can you help me
resolve the issue?
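For anyone hitting the same thing: ``pip install openpyxl`` is an operating-system shell command, not Python code, so typing it at the ``>>>`` prompt raises a SyntaxError. A minimal demonstration in plain Python:

```python
# "pip install openpyxl" is not valid Python syntax, so entering it at
# the interactive ">>>" prompt fails. compile() shows the same error:
try:
    compile("pip install openpyxl", "<stdin>", "exec")
    raised = False
except SyntaxError:
    raised = True
print("SyntaxError inside the interpreter:", raised)
```

From the operating-system shell (not the Python prompt), ``python -m pip install openpyxl`` performs the install.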
This is moved over from https://github.com/pypa/packaging-problems/issues/112.
Currently, PyPI has no limitations around deleting packages, releases,
or artifacts. This can be problematic for users, as their builds can
break without warning if a dependency is removed from PyPI.
In the Node ecosystem, a similar lack of limitations there caused
significant problems about a year and a half ago, when a widely-used
package was deleted following a dispute:
At the time, the scope of the impact was characterized as breaking
thousands of dependent projects.
To resolve this, npm adopted a policy where package deletions (there's
no distinction between a release and an artifact there) could only be
done for the first 24 hours after a release was published. Deletions
after the 24 hour mark require contacting npm support, and are
contingent on the absence of dependents for the deleted release.
Of course, npm is a venture-backed for-profit enterprise with a
paid support team – that's not the case for PyPI, so the "support" half
of the solution above doesn't carry over. However, as a starting
point, it might still be a good idea to restrict package deletion
after that 24-hour window.
At least, in the examples given in the links above, the deletion of
old packages is strictly a nice-to-have for the package maintainers,
balanced against potentially breaking impacts for users.
Thanks in advance for any feedback.
On 14 Dec. 2017 1:55 pm, "Donald Stufft" <donald(a)stufft.io> wrote:
Overall I’m +1, not sure if it would be a new PEP or an amendment to the
current PEPs but I think it’s a good idea.
It would be a new PEP bumping the metadata version to 1.4 (that will be
less tedious than it used to be though, since it will only have to cover
the enhancement, not duplicate all the old field descriptions).
Somewhat related though, we should probably have a summary table in PyPUG
mapping metadata versions to minimum required tooling versions & their
respective release dates. That way folks can make informed decisions about
their installability choices before deciding which metadata version to use
in their own projects.
Distutils-SIG maillist - Distutils-SIG(a)python.org
I'm about to release a new version of importlib_resources, so I want to
get my flit.ini's requires-python clause right. We support Python 2.7,
and 3.4 and beyond. This makes me sad:
requires-python = '>=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*'
Of course, I'd like to write this like:
requires-python = '(>=2.7 and <3) or >= 3.4'
I understand that OR clauses aren't supported under any syntax
currently, but as PEPs 566 and 508 are still open/active, wouldn't it be
reasonable to support something like this explicitly?
It seems like wanting to support 2.7 and some versions of Python 3 (but
not all) is a fairly common need.
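As a side note, the intent of such an OR clause can be sketched in plain Python (the ``supported`` helper below is hypothetical, operating on (major, minor) tuples):

```python
# Hypothetical helper expressing "(>=2.7 and <3) or >=3.4" over
# (major, minor) version tuples -- the logic the flat specifier encodes.
def supported(version):
    """Return True if the given (major, minor) Python version is supported."""
    return version >= (2, 7) and (version < (3, 0) or version >= (3, 4))

# Spot-check some versions on either side of the boundaries.
checks = {v: supported(v) for v in [(2, 6), (2, 7), (3, 0), (3, 3), (3, 4), (3, 6)]}
```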
I'd appreciate your feedback on the following Metadata 1.3 PEP.
The goal here is not to provide a full specification for all fields as
in previous PEPs, but to:
* Motivate and describe the addition of new fields and changes to
  existing fields;
* Point to the Core Metadata Specifications reference document as the
canonical source for the specification.
Title: Metadata for Python Software Packages 1.3
Author: Dustin Ingram <di(a)di.codes>
Discussions-To: distutils-sig <distutils-sig at python.org>
Type: Standards Track
This PEP describes the changes between versions 1.2 and 1.3 of the core
metadata specification for Python packages. Version 1.2 is specified in PEP
345.
It also changes the canonical source for field specifications to the `Core
Metadata Specification`_ reference document, which specifies the field
names and their semantics and usage.
The canonical source for the names and semantics of each of the supported
metadata fields is the `Core Metadata Specification`_ document.
Fields marked with "(Multiple use)" may be specified multiple times in a single
PKG-INFO file. Other fields may only occur once in a PKG-INFO file. Fields
marked with "(optional)" are not required to appear in a valid PKG-INFO file;
all other fields must be present.
New in Version 1.3
A string stating the markup syntax (if any) used in the distribution's
description, so that tools can intelligently render the description.
Historically, tools like PyPI have assumed that a package's description is
formatted in reStructuredText (reST), falling back on plain text if the
description is not valid reST.
The introduction of this field allows PyPI to support additional types of
markup syntax, without needing to make this assumption.
The full specification for this field is defined in the `Core Metadata
Specification`_.
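For illustration, a description written in GitHub-Flavored Markdown could be declared like this:

```
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
```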
Provides-Extra (optional, multiple use)
A string containing the name of an optional feature. Must be a valid Python
identifier. May be used to make a dependency conditional on whether the
optional feature has been requested.
The introduction of this field allows package installation tools (such as
``pip``) to determine which extras are provided by a given package, and
package publication tools (such as ``twine``) to check for issues with
environment markers which use extras.
The full specification for this field is defined in the `Core Metadata
Specification`_.
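As a sketch (borrowing an example from PEP 508), a package offering an optional ``pdf`` feature might declare:

```
Provides-Extra: pdf
Requires-Dist: reportlab; extra == 'pdf'
```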
Changed in Version 1.3
The specification for the format of this field is now identical to the
distribution name specification defined in PEP 508.
Version numbering requirements and the semantics for specifying comparisons
between versions are defined in PEP 440.
An **environment marker** is a marker that can be added at the end of a
field after a semi-colon (";"), to add a condition about the execution
environment.
The environment marker format used to declare such a condition is defined in
the environment markers section of PEP 508.
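For example (adapted from PEP 508's examples), a dependency needed only on Windows can be declared as:

```
Requires-Dist: pywin32 >= 1.0; sys_platform == 'win32'
```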
It may be necessary to store metadata in a data structure which does not
allow for multiple repeated keys, such as JSON.
The canonical method to transform metadata fields into such a data structure
is as follows:
#. The original key-value format should be read with
   ``email.parser.HeaderParser``;
#. All transformed keys should be reduced to lower case, but otherwise should
retain all other characters;
#. The transformed value for any field marked with "(Multiple use)" should be a
single list containing all the original values for the given key;
#. The ``Keywords`` field should be converted to a list by splitting the
original value on whitespace characters;
#. The result should be stored as a string-keyed dictionary.
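The steps above can be sketched as follows (the PKG-INFO content and the subset of multiple-use fields are illustrative assumptions):

```python
import email.parser
import json

# Illustrative PKG-INFO content; real files carry many more fields.
PKG_INFO = """\
Metadata-Version: 1.3
Name: example-dist
Version: 1.0
Keywords: packaging metadata example
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
"""

# Assumed subset of the "(Multiple use)" fields, lower-cased.
MULTIPLE_USE = {"classifier", "provides-extra", "requires-dist"}

# Step 1: read the original key-value format with email.parser.
msg = email.parser.HeaderParser().parsestr(PKG_INFO)

metadata = {}
for key in frozenset(msg.keys()):
    lowered = key.lower()              # step 2: lower-case the keys
    values = msg.get_all(key)
    if lowered in MULTIPLE_USE:        # step 3: multiple-use fields -> lists
        metadata[lowered] = values
    elif lowered == "keywords":        # step 4: split Keywords on whitespace
        metadata[lowered] = values[0].split()
    else:
        metadata[lowered] = values[0]

# Step 5: the result is a string-keyed dictionary, JSON-serializable.
print(json.dumps(metadata, sort_keys=True))
```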
Summary of Differences From PEP 345
* Metadata-Version is now 1.3.
* Fields are now specified via the `Core Metadata Specification`_.
* Added two new fields: ``Description-Content-Type`` and ``Provides-Extra``.
* Acceptable values for the ``Name`` field are now specified as per PEP 508.
* Added canonical method of transformation into JSON-compatible data structure.
This document specifies version 1.3 of the metadata format.
Version 1.0 is specified in PEP 241.
Version 1.1 is specified in PEP 314.
Version 1.2 is specified in PEP 345.
.. _`Core Metadata Specification`:
This document has been placed in the public domain.
Thanks to Nick Coghlan for contributing to this PEP.
I think I implicitly knew this, but as I've just released a package (to
be announced soon) that actually has multiple authors, I found out first
hand that PyPI rejects uploads where the author-email field isn't a
completely valid email address, and that there is no support for
multiple author emails.
As it turns out, you can kludge this into your pyproject.toml or
setup.py file. flit for example separates multiple emails with a
newline, but you could also separate them with commas. You don't notice
the problem until PyPI rejects the upload (with a 400 IIRC).
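For what it's worth, a comma-separated value can at least be parsed unambiguously with the stdlib (the names and addresses below are made up):

```python
from email.utils import getaddresses

# Sketch: splitting a comma-separated author-email value into
# (name, address) pairs using RFC 2822 address parsing.
raw = "Alice Example <alice@example.com>, Bob Example <bob@example.com>"
authors = getaddresses([raw])
print(authors)
```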
I filed this issue with flit: https://github.com/takluyver/flit/issues/153
It looks like Thomas agrees that flit will eventually validate its fields
so that errors surface early. It was a bit of a PITA to do my upload
because I didn't notice the problem until after I'd tagged the repo.
Multiple package authors doesn't seem like that fringe of a use case;
are there any plans, documents, PEPs, musings, grumbles about supporting
multiple package authors explicitly?
As Donald and Nick mentioned last month, several folks are -- thanks
to Mozilla Open Source Support -- working on deploying Warehouse and
making PyPI more sustainable. I'm the main project manager on this
effort and am putting notes on wiki.python.org as we go.
Ernest W. Durbin III gave the closing talk at North Bay Python a few
days ago: "Running Vintage Software: PyPI's Aging Codebase." As he
mentioned (video), he's one of the people who'll be working on this
project, funded by MOSS, to get Warehouse across the finish line! The
full cast is:
* Donald Stufft, backend
* Dustin Ingram, backend
* Ernest W. Durbin III, backend
* Laura Hampton, project management, testing, documentation
* Nicole Harris, frontend and design
* Sumana Harihareswara, project management, product management, testing,
None of us are working 40 hours/week on this but the six people I listed
above are dedicating committed time to the work. And we're grateful for
help from Mark Mangoba, Eric Holscher, and of course several other folks
who are involved in the PSF, PyPA, or MOSS.
We had a kickoff call on Monday and today a few of us spoke in
#pypa-dev and fleshed out our milestones a bit. This may still
change, but right now, the sequence is:
1. Maintainer Minimum Viable Product (for package owners) -- ask some
maintainers to use, test, and report back
2. End User MVP -- ask some Python users who aren't package owners to
use Warehouse, test, and report back
3. Publicize beta more broadly (invite more testing, redirect some `pip
4. Launch/redirect `pip` and browsers to Warehouse
5. Shut down legacy PyPI
6. Cool feature ideas that are not on critical path
We'll be putting issues about infrastructure work (e.g., Kubernetes) in
https://github.com/python/pypi-infra . We've started filing, updating,
and arranging issues but that isn't yet complete enough for me to even
start making schedule estimates. I aim to have a clearer picture next
week. (I won't spam this list with updates when there's nothing new to
say, but at least as we start I figure you might want news a few times a
month to learn how things are shaping up; please let me know privately
if my estimation there is way off.)