Mailman-coders
October 2014
- 12 participants
- 138 discussions
[Question #256265]: How do I make a list that Members can't Respond-All to?
by Brian Feraru Oct. 27, 2014
New question #256265 on mailman in Ubuntu:
https://answers.launchpad.net/ubuntu/+source/mailman/+question/256265
Hello Ubuntu Community,
I am running Mailman 2.1.17 and using it to run a mailing list. I want to be able to mail everyone on the list without people being able to reply to the entire group. I'm trying to figure out how to set that up, but I'm not sure how.
Is this done with the Mail <-> News gateway settings?
Any assistance is appreciated. Thanks!
--
You received this question notification because you are a member of
Mailman Coders, which is an answer contact for mailman in Ubuntu.
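For context, an announce-only list in Mailman 2.1 is usually set up through the list's sender filters rather than the Mail <-> News gateway. A minimal sketch, assuming the goal is to reject members' replies sent to the list address; the attribute names and numeric action codes below are the ones exposed by `bin/config_list` in 2.1 and should be checked against the 2.1.17 admin pages:

    # announce_only.py -- apply with: bin/config_list -i announce_only.py LISTNAME
    # Moderate every member post by default...
    default_member_moderation = True
    # ...and reject moderated member posts instead of holding them for review.
    member_moderation_action = 1      # 0 = Hold, 1 = Reject, 2 = Discard
    # Reject posts from non-members as well.
    generic_nonmember_action = 2      # 0 = Accept, 1 = Hold, 2 = Reject, 3 = Discard

With these settings the list owner typically posts by clearing the moderation flag on his or her own membership, so announcements still go out while replies from everyone else are rejected.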
[Bug 558096] Re: more verbose "discarded" message in vette log
Oct. 25, 2014
Mailman will now log the list name and handler name in the vette log
auto-discard message. While this is different from the suggested patch,
I think it is sufficient to find the discard reason.
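For illustration only, the kind of log call this describes might look like the sketch below in a 2.1 pipeline handler. `syslog('vette', ...)` and `mlist.internal_name()` are existing 2.1 APIs, but the handler name and the exact message wording here are assumptions; the committed change is on the linked 2.1 branch.

    # Sketch: an auto-discard entry that carries the list and handler names.
    from Mailman.Logging.Syslog import syslog

    def log_auto_discard(mlist, handler_name, msg):
        syslog('vette', '%s: %s auto-discard of message from %s',
               mlist.internal_name(), handler_name,
               msg.get('from', 'unknown'))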
** Changed in: mailman
Importance: Undecided => Low
** Changed in: mailman
Status: New => Fix Committed
** Changed in: mailman
Milestone: None => 2.1.19
** Changed in: mailman
Assignee: (unassigned) => Mark Sapiro (msapiro)
--
You received this bug notification because you are a member of Mailman
Coders, which is subscribed to GNU Mailman.
https://bugs.launchpad.net/bugs/558096
Title:
more verbose "discarded" message in vette log
To manage notifications about this bug go to:
https://bugs.launchpad.net/mailman/+bug/558096/+subscriptions
[Bug 558096] Re: more verbose "discarded" message in vette log
by Launchpad Bug Tracker Oct. 25, 2014
** Branch linked: lp:mailman/2.1
--
You received this bug notification because you are a member of Mailman
Coders, which is subscribed to GNU Mailman.
https://bugs.launchpad.net/bugs/558096
Title:
more verbose "discarded" message in vette log
To manage notifications about this bug go to:
https://bugs.launchpad.net/mailman/+bug/558096/+subscriptions
[Merge] lp:~barry/mailman/abhilash into lp:mailman
I don't know how to comment inline in lp, so sorry for the last comment, which went out empty.
In sqlite.py the permissions should be `0666` like before, I suppose? It (`0o666`) is probably just a typo in the code.
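For reference, the two spellings denote the same mode bits; the bare `0666` form is the Python 2 octal literal, while `0o666` is the spelling accepted by both Python 2.6+ and Python 3:

    # Both evaluate to the same rw-rw-rw- mode (decimal 438).
    import stat
    assert 0o666 == 438
    assert 0o666 == (stat.S_IRUSR | stat.S_IWUSR |
                     stat.S_IRGRP | stat.S_IWGRP |
                     stat.S_IROTH | stat.S_IWOTH)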
--
https://code.launchpad.net/~barry/mailman/abhilash/+merge/238222
Your team Mailman Coders is requested to review the proposed merge of lp:~barry/mailman/abhilash into lp:mailman.
Diff comments:
> === modified file 'MANIFEST.in'
> --- MANIFEST.in 2011-06-11 18:16:04 +0000
> +++ MANIFEST.in 2014-10-14 01:13:52 +0000
> @@ -1,4 +1,4 @@
> -include *.py *.rc
> +include *.py *.rc *.mako
> include COPYING
> recursive-include .buildout *
> recursive-include contrib *
> @@ -6,7 +6,6 @@
> recursive-include data *
> global-include *.txt *.rst *.po *.mo *.cfg *.sql *.zcml *.html
> global-exclude *.egg-info
> -exclude MANIFEST.in
> prune src/attic
> prune src/web
> prune eggs
>
> === modified file 'setup.py'
> --- setup.py 2014-04-15 16:06:01 +0000
> +++ setup.py 2014-10-14 01:13:52 +0000
> @@ -93,6 +93,7 @@
> 'console_scripts' : list(scripts),
> },
> install_requires = [
> + 'alembic',
> 'enum34',
> 'flufl.bounce',
> 'flufl.i18n',
> @@ -104,7 +105,7 @@
> 'nose2',
> 'passlib',
> 'restish',
> - 'storm',
> + 'sqlalchemy',
> 'zope.component',
> 'zope.configuration',
> 'zope.event',
>
> === modified file 'src/mailman/app/subscriptions.py'
> --- src/mailman/app/subscriptions.py 2014-04-15 14:03:39 +0000
> +++ src/mailman/app/subscriptions.py 2014-10-14 01:13:52 +0000
> @@ -28,7 +28,7 @@
>
> from operator import attrgetter
> from passlib.utils import generate_password as generate
> -from storm.expr import And, Or
> +from sqlalchemy import and_, or_
> from uuid import UUID
> from zope.component import getUtility
> from zope.interface import implementer
> @@ -88,9 +88,7 @@
> @dbconnection
> def get_member(self, store, member_id):
> """See `ISubscriptionService`."""
> - members = store.find(
> - Member,
> - Member._member_id == member_id)
> + members = store.query(Member).filter(Member._member_id == member_id)
> if members.count() == 0:
> return None
> else:
> @@ -117,8 +115,8 @@
> # This probably could be made more efficient.
> if address is None or user is None:
> return []
> - query.append(Or(Member.address_id == address.id,
> - Member.user_id == user.id))
> + query.append(or_(Member.address_id == address.id,
> + Member.user_id == user.id))
> else:
> # subscriber is a user id.
> user = user_manager.get_user_by_id(subscriber)
> @@ -126,15 +124,15 @@
> if address.id is not None)
> if len(address_ids) == 0 or user is None:
> return []
> - query.append(Or(Member.user_id == user.id,
> - Member.address_id.is_in(address_ids)))
> + query.append(or_(Member.user_id == user.id,
> + Member.address_id.in_(address_ids)))
> # Calculate the rest of the query expression, which will get And'd
> # with the Or clause above (if there is one).
> if list_id is not None:
> query.append(Member.list_id == list_id)
> if role is not None:
> query.append(Member.role == role)
> - results = store.find(Member, And(*query))
> + results = store.query(Member).filter(and_(*query))
> return sorted(results, key=_membership_sort_key)
>
> def __iter__(self):
>
> === modified file 'src/mailman/bin/tests/test_master.py'
> --- src/mailman/bin/tests/test_master.py 2014-04-28 15:23:35 +0000
> +++ src/mailman/bin/tests/test_master.py 2014-10-14 01:13:52 +0000
> @@ -55,7 +55,7 @@
> lock = master.acquire_lock_1(False, self.lock_file)
> is_locked = lock.is_locked
> lock.unlock()
> - self.failUnless(is_locked)
> + self.assertTrue(is_locked)
>
> def test_master_state(self):
> my_lock = Lock(self.lock_file)
>
> === modified file 'src/mailman/commands/docs/conf.rst'
> --- src/mailman/commands/docs/conf.rst 2013-09-01 15:08:46 +0000
> +++ src/mailman/commands/docs/conf.rst 2014-10-14 01:13:52 +0000
> @@ -49,6 +49,7 @@
> [logging.config] path: mailman.log
> [logging.error] path: mailman.log
> [logging.smtp] path: smtp.log
> + [logging.database] path: mailman.log
> [logging.http] path: mailman.log
> [logging.root] path: mailman.log
> [logging.fromusenet] path: mailman.log
>
> === added file 'src/mailman/config/alembic.cfg'
> --- src/mailman/config/alembic.cfg 1970-01-01 00:00:00 +0000
> +++ src/mailman/config/alembic.cfg 2014-10-14 01:13:52 +0000
> @@ -0,0 +1,20 @@
> +# Copyright (C) 2014 by the Free Software Foundation, Inc.
> +#
> +# This file is part of GNU Mailman.
> +#
> +# GNU Mailman is free software: you can redistribute it and/or modify it under
> +# the terms of the GNU General Public License as published by the Free
> +# Software Foundation, either version 3 of the License, or (at your option)
> +# any later version.
> +#
> +# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> +# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> +# more details.
> +#
> +# You should have received a copy of the GNU General Public License along with
> +# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> +
> +[alembic]
> +# Path to Alembic migration scripts.
> +script_location: mailman.database:alembic
>
> === modified file 'src/mailman/config/config.py'
> --- src/mailman/config/config.py 2014-03-02 22:59:30 +0000
> +++ src/mailman/config/config.py 2014-10-14 01:13:52 +0000
> @@ -33,7 +33,7 @@
> from ConfigParser import SafeConfigParser
> from flufl.lock import Lock
> from lazr.config import ConfigSchema, as_boolean
> -from pkg_resources import resource_filename, resource_stream, resource_string
> +from pkg_resources import resource_stream, resource_string
> from string import Template
> from zope.component import getUtility
> from zope.event import notify
> @@ -46,7 +46,7 @@
> ConfigurationUpdatedEvent, IConfiguration, MissingConfigurationFileError)
> from mailman.interfaces.languages import ILanguageManager
> from mailman.utilities.filesystem import makedirs
> -from mailman.utilities.modules import call_name
> +from mailman.utilities.modules import call_name, expand_path
>
>
> SPACE = ' '
> @@ -304,12 +304,7 @@
> :return: A `ConfigParser` instance.
> """
> # Is the context coming from a file system or Python path?
> - if path.startswith('python:'):
> - resource_path = path[7:]
> - package, dot, resource = resource_path.rpartition('.')
> - cfg_path = resource_filename(package, resource + '.cfg')
> - else:
> - cfg_path = path
> + cfg_path = expand_path(path)
> parser = SafeConfigParser()
> files = parser.read(cfg_path)
> if files != [cfg_path]:
>
> === modified file 'src/mailman/config/configure.zcml'
> --- src/mailman/config/configure.zcml 2013-11-26 02:26:15 +0000
> +++ src/mailman/config/configure.zcml 2014-10-14 01:13:52 +0000
> @@ -40,20 +40,6 @@
> factory="mailman.model.requests.ListRequests"
> />
>
> - <adapter
> - for="mailman.interfaces.database.IDatabase"
> - provides="mailman.interfaces.database.ITemporaryDatabase"
> - factory="mailman.database.sqlite.make_temporary"
> - name="sqlite"
> - />
> -
> - <adapter
> - for="mailman.interfaces.database.IDatabase"
> - provides="mailman.interfaces.database.ITemporaryDatabase"
> - factory="mailman.database.postgresql.make_temporary"
> - name="postgres"
> - />
> -
> <utility
> provides="mailman.interfaces.bounce.IBounceProcessor"
> factory="mailman.model.bounce.BounceProcessor"
> @@ -72,12 +58,6 @@
> />
>
> <utility
> - provides="mailman.interfaces.database.IDatabaseFactory"
> - factory="mailman.database.factory.DatabaseTemporaryFactory"
> - name="temporary"
> - />
> -
> - <utility
> provides="mailman.interfaces.domain.IDomainManager"
> factory="mailman.model.domain.DomainManager"
> />
>
> === modified file 'src/mailman/config/schema.cfg'
> --- src/mailman/config/schema.cfg 2014-01-01 14:59:42 +0000
> +++ src/mailman/config/schema.cfg 2014-10-14 01:13:52 +0000
> @@ -204,9 +204,6 @@
> url: sqlite:///$DATA_DIR/mailman.db
> debug: no
>
> -# The module path to the migrations modules.
> -migrations_path: mailman.database.schema
> -
> [logging.template]
> # This defines various log settings. The options available are:
> #
> @@ -229,6 +226,7 @@
> # - archiver -- All archiver output
> # - bounce -- All bounce processing logs go here
> # - config -- Configuration issues
> +# - database -- Database logging (SQLAlchemy and Alembic)
> # - debug -- Only used for development
> # - error -- All exceptions go to this log
> # - fromusenet -- Information related to the Usenet to Mailman gateway
> @@ -255,6 +253,8 @@
>
> [logging.config]
>
> +[logging.database]
> +
> [logging.debug]
> path: debug.log
> level: info
> @@ -304,6 +304,9 @@
>
> [logging.vette]
>
> +[logging.database]
> +level: warn
> +
>
> [webservice]
> # The hostname at which admin web service resources are exposed.
> @@ -532,7 +535,7 @@
> # following values.
>
> # The class implementing the IArchiver interface.
> -class:
> +class:
>
> # Set this to 'yes' to enable the archiver.
> enable: no
>
> === modified file 'src/mailman/core/logging.py'
> --- src/mailman/core/logging.py 2014-04-28 15:23:35 +0000
> +++ src/mailman/core/logging.py 2014-10-14 01:13:52 +0000
> @@ -104,6 +104,27 @@
>
>
>
> +def _init_logger(propagate, sub_name, log, logger_config):
> + # Get settings from log configuration file (or defaults).
> + log_format = logger_config.format
> + log_datefmt = logger_config.datefmt
> + # Propagation to the root logger is how we handle logging to stderr
> + # when the runners are not run as a subprocess of 'bin/mailman start'.
> + log.propagate = (as_boolean(logger_config.propagate)
> + if propagate is None else propagate)
> + # Set the logger's level.
> + log.setLevel(as_log_level(logger_config.level))
> + # Create a formatter for this logger, then a handler, and link the
> + # formatter to the handler.
> + formatter = logging.Formatter(fmt=log_format, datefmt=log_datefmt)
> + path_str = logger_config.path
> + path_abs = os.path.normpath(os.path.join(config.LOG_DIR, path_str))
> + handler = ReopenableFileHandler(sub_name, path_abs)
> + _handlers[sub_name] = handler
> + handler.setFormatter(formatter)
> + log.addHandler(handler)
> +
> +
> def initialize(propagate=None):
> """Initialize all logs.
>
> @@ -126,28 +147,18 @@
> continue
> if sub_name == 'locks':
> log = logging.getLogger('flufl.lock')
> + if sub_name == 'database':
> + # Set both the SQLAlchemy and Alembic logs to the mailman.database
> + # log configuration, essentially ignoring the alembic.cfg
> + # settings. Do the SQLAlchemy one first, then let the Alembic one
> + # fall through to the common code path.
> + log = logging.getLogger('sqlalchemy')
> + _init_logger(propagate, sub_name, log, logger_config)
> + log = logging.getLogger('alembic')
> else:
> logger_name = 'mailman.' + sub_name
> log = logging.getLogger(logger_name)
> - # Get settings from log configuration file (or defaults).
> - log_format = logger_config.format
> - log_datefmt = logger_config.datefmt
> - # Propagation to the root logger is how we handle logging to stderr
> - # when the runners are not run as a subprocess of 'bin/mailman start'.
> - log.propagate = (as_boolean(logger_config.propagate)
> - if propagate is None else propagate)
> - # Set the logger's level.
> - log.setLevel(as_log_level(logger_config.level))
> - # Create a formatter for this logger, then a handler, and link the
> - # formatter to the handler.
> - formatter = logging.Formatter(fmt=log_format, datefmt=log_datefmt)
> - path_str = logger_config.path
> - path_abs = os.path.normpath(os.path.join(config.LOG_DIR, path_str))
> - handler = ReopenableFileHandler(sub_name, path_abs)
> - _handlers[sub_name] = handler
> - handler.setFormatter(formatter)
> - log.addHandler(handler)
> -
> + _init_logger(propagate, sub_name, log, logger_config)
>
>
> def reopen():
>
> === added directory 'src/mailman/database/alembic'
> === added file 'src/mailman/database/alembic/__init__.py'
> --- src/mailman/database/alembic/__init__.py 1970-01-01 00:00:00 +0000
> +++ src/mailman/database/alembic/__init__.py 2014-10-14 01:13:52 +0000
> @@ -0,0 +1,32 @@
> +# Copyright (C) 2014 by the Free Software Foundation, Inc.
> +#
> +# This file is part of GNU Mailman.
> +#
> +# GNU Mailman is free software: you can redistribute it and/or modify it under
> +# the terms of the GNU General Public License as published by the Free
> +# Software Foundation, either version 3 of the License, or (at your option)
> +# any later version.
> +#
> +# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> +# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> +# more details.
> +#
> +# You should have received a copy of the GNU General Public License along with
> +# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> +
> +"""Alembic configuration initization."""
> +
> +from __future__ import absolute_import, print_function, unicode_literals
> +
> +__metaclass__ = type
> +__all__ = [
> + 'alembic_cfg',
> + ]
> +
> +
> +from alembic.config import Config
> +from mailman.utilities.modules import expand_path
> +
> +
> +alembic_cfg = Config(expand_path('python:mailman.config.alembic'))
>
> === added file 'src/mailman/database/alembic/env.py'
> --- src/mailman/database/alembic/env.py 1970-01-01 00:00:00 +0000
> +++ src/mailman/database/alembic/env.py 2014-10-14 01:13:52 +0000
> @@ -0,0 +1,75 @@
> +# Copyright (C) 2014 by the Free Software Foundation, Inc.
> +#
> +# This file is part of GNU Mailman.
> +#
> +# GNU Mailman is free software: you can redistribute it and/or modify it under
> +# the terms of the GNU General Public License as published by the Free
> +# Software Foundation, either version 3 of the License, or (at your option)
> +# any later version.
> +#
> +# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> +# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> +# more details.
> +#
> +# You should have received a copy of the GNU General Public License along with
> +# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> +
> +"""Alembic migration environment."""
> +
> +from __future__ import absolute_import, print_function, unicode_literals
> +
> +__metaclass__ = type
> +__all__ = [
> + 'run_migrations_offline',
> + 'run_migrations_online',
> + ]
> +
> +
> +from alembic import context
> +from contextlib import closing
> +from sqlalchemy import create_engine
> +
> +from mailman.config import config
> +from mailman.database.model import Model
> +from mailman.utilities.string import expand
> +
> +
> +
> +def run_migrations_offline():
> + """Run migrations in 'offline' mode.
> +
> + This configures the context with just a URL and not an Engine,
> + though an Engine is acceptable here as well. By skipping the Engine
> + creation we don't even need a DBAPI to be available.
> +
> + Calls to context.execute() here emit the given string to the script
> + output.
> + """
> + url = expand(config.database.url, config.paths)
> + context.configure(url=url, target_metadata=Model.metadata)
> + with context.begin_transaction():
> + context.run_migrations()
> +
> +
> +def run_migrations_online():
> + """Run migrations in 'online' mode.
> +
> + In this scenario we need to create an Engine and associate a
> + connection with the context.
> + """
> + url = expand(config.database.url, config.paths)
> + engine = create_engine(url)
> +
> + connection = engine.connect()
> + with closing(connection):
> + context.configure(
> + connection=connection, target_metadata=Model.metadata)
> + with context.begin_transaction():
> + context.run_migrations()
> +
> +
> +if context.is_offline_mode():
> + run_migrations_offline()
> +else:
> + run_migrations_online()
>
> === added file 'src/mailman/database/alembic/script.py.mako'
> --- src/mailman/database/alembic/script.py.mako 1970-01-01 00:00:00 +0000
> +++ src/mailman/database/alembic/script.py.mako 2014-10-14 01:13:52 +0000
> @@ -0,0 +1,22 @@
> +"""${message}
> +
> +Revision ID: ${up_revision}
> +Revises: ${down_revision}
> +Create Date: ${create_date}
> +
> +"""
> +
> +# revision identifiers, used by Alembic.
> +revision = ${repr(up_revision)}
> +down_revision = ${repr(down_revision)}
> +
> +from alembic import op
> +import sqlalchemy as sa
> +${imports if imports else ""}
> +
> +def upgrade():
> + ${upgrades if upgrades else "pass"}
> +
> +
> +def downgrade():
> + ${downgrades if downgrades else "pass"}
>
> === added directory 'src/mailman/database/alembic/versions'
> === added file 'src/mailman/database/alembic/versions/51b7f92bd06c_initial.py'
> --- src/mailman/database/alembic/versions/51b7f92bd06c_initial.py 1970-01-01 00:00:00 +0000
> +++ src/mailman/database/alembic/versions/51b7f92bd06c_initial.py 2014-10-14 01:13:52 +0000
> @@ -0,0 +1,66 @@
> +# Copyright (C) 2014 by the Free Software Foundation, Inc.
> +#
> +# This file is part of GNU Mailman.
> +#
> +# GNU Mailman is free software: you can redistribute it and/or modify it under
> +# the terms of the GNU General Public License as published by the Free
> +# Software Foundation, either version 3 of the License, or (at your option)
> +# any later version.
> +#
> +# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> +# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> +# more details.
> +#
> +# You should have received a copy of the GNU General Public License along with
> +# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> +
> +"""Initial migration.
> +
> +This empty migration file makes sure there is always an alembic_version
> +in the database. As a consequence, if the database version is reported
> +as None, it means the database needs to be created from scratch with
> +SQLAlchemy itself.
> +
> +It also removes schema items left over from Storm.
> +
> +Revision ID: 51b7f92bd06c
> +Revises: None
> +Create Date: 2014-10-10 09:53:35.624472
> +"""
> +
> +from __future__ import absolute_import, print_function, unicode_literals
> +
> +__metaclass__ = type
> +__all__ = [
> + 'downgrade',
> + 'upgrade',
> + ]
> +
> +
> +from alembic import op
> +import sqlalchemy as sa
> +
> +
> +# Revision identifiers, used by Alembic.
> +revision = '51b7f92bd06c'
> +down_revision = None
> +
> +
> +def upgrade():
> + op.drop_table('version')
> + if op.get_bind().dialect.name != 'sqlite':
> + # SQLite does not support dropping columns.
> + op.drop_column('mailinglist', 'acceptable_aliases_id')
> + op.create_index(op.f('ix_user__user_id'), 'user',
> + ['_user_id'], unique=False)
> + op.drop_index('ix_user_user_id', table_name='user')
> +
> +
> +def downgrade():
> + op.create_table('version')
> + op.create_index('ix_user_user_id', 'user', ['_user_id'], unique=False)
> + op.drop_index(op.f('ix_user__user_id'), table_name='user')
> + op.add_column(
> + 'mailinglist',
> + sa.Column('acceptable_aliases_id', sa.INTEGER(), nullable=True))
>
> === modified file 'src/mailman/database/base.py'
> --- src/mailman/database/base.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/base.py 2014-10-14 01:13:52 +0000
> @@ -19,49 +19,39 @@
>
> __metaclass__ = type
> __all__ = [
> - 'StormBaseDatabase',
> + 'SABaseDatabase',
> ]
>
>
> -import os
> -import sys
> import logging
>
> -from lazr.config import as_boolean
> -from pkg_resources import resource_listdir, resource_string
> -from storm.cache import GenerationalCache
> -from storm.locals import create_database, Store
> +from sqlalchemy import create_engine
> +from sqlalchemy.orm import sessionmaker
> from zope.interface import implementer
>
> from mailman.config import config
> from mailman.interfaces.database import IDatabase
> -from mailman.model.version import Version
> from mailman.utilities.string import expand
>
> +
> log = logging.getLogger('mailman.config')
> -
> NL = '\n'
>
>
>
> @implementer(IDatabase)
> -class StormBaseDatabase:
> - """The database base class for use with the Storm ORM.
> +class SABaseDatabase:
> + """The database base class for use with SQLAlchemy.
>
> - Use this as a base class for your DB-specific derived classes.
> + Use this as a base class for your DB-Specific derived classes.
> """
> -
> - # Tag used to distinguish the database being used. Override this in base
> - # classes.
> - TAG = ''
> -
> def __init__(self):
> self.url = None
> self.store = None
>
> def begin(self):
> """See `IDatabase`."""
> - # Storm takes care of this for us.
> + # SQLAlchemy does this for us.
> pass
>
> def commit(self):
> @@ -72,16 +62,6 @@
> """See `IDatabase`."""
> self.store.rollback()
>
> - def _database_exists(self):
> - """Return True if the database exists and is initialized.
> -
> - Return False when Mailman needs to create and initialize the
> - underlying database schema.
> -
> - Base classes *must* override this.
> - """
> - raise NotImplementedError
> -
> def _pre_reset(self, store):
> """Clean up method for testing.
>
> @@ -113,6 +93,7 @@
> """See `IDatabase`."""
> # Calculate the engine url.
> url = expand(config.database.url, config.paths)
> + self._prepare(url)
> log.debug('Database url: %s', url)
> # XXX By design of SQLite, database file creation does not honor
> # umask. See their ticket #1193:
> @@ -129,101 +110,7 @@
> # engines, and yes, we could have chmod'd the file after the fact, but
> # half dozen and all...
> self.url = url
> - self._prepare(url)
> - database = create_database(url)
> - store = Store(database, GenerationalCache())
> - database.DEBUG = (as_boolean(config.database.debug)
> - if debug is None else debug)
> - self.store = store
> - store.commit()
> -
> - def load_migrations(self, until=None):
> - """Load schema migrations.
> -
> - :param until: Load only the migrations up to the specified timestamp.
> - With default value of None, load all migrations.
> - :type until: string
> - """
> - migrations_path = config.database.migrations_path
> - if '.' in migrations_path:
> - parent, dot, child = migrations_path.rpartition('.')
> - else:
> - parent = migrations_path
> - child = ''
> - # If the database does not yet exist, load the base schema.
> - filenames = sorted(resource_listdir(parent, child))
> - # Find out which schema migrations have already been loaded.
> - if self._database_exists(self.store):
> - versions = set(version.version for version in
> - self.store.find(Version, component='schema'))
> - else:
> - versions = set()
> - for filename in filenames:
> - module_fn, extension = os.path.splitext(filename)
> - if extension != '.py':
> - continue
> - parts = module_fn.split('_')
> - if len(parts) < 2:
> - continue
> - version = parts[1].strip()
> - if len(version) == 0:
> - # Not a schema migration file.
> - continue
> - if version in versions:
> - log.debug('already migrated to %s', version)
> - continue
> - if until is not None and version > until:
> - # We're done.
> - break
> - module_path = migrations_path + '.' + module_fn
> - __import__(module_path)
> - upgrade = getattr(sys.modules[module_path], 'upgrade', None)
> - if upgrade is None:
> - continue
> - log.debug('migrating db to %s: %s', version, module_path)
> - upgrade(self, self.store, version, module_path)
> - self.commit()
> -
> - def load_sql(self, store, sql):
> - """Load the given SQL into the store.
> -
> - :param store: The Storm store to load the schema into.
> - :type store: storm.locals.Store`
> - :param sql: The possibly multi-line SQL to load.
> - :type sql: string
> - """
> - # Discard all blank and comment lines.
> - lines = (line for line in sql.splitlines()
> - if line.strip() != '' and line.strip()[:2] != '--')
> - sql = NL.join(lines)
> - for statement in sql.split(';'):
> - if statement.strip() != '':
> - store.execute(statement + ';')
> -
> - def load_schema(self, store, version, filename, module_path):
> - """Load the schema from a file.
> -
> - This is a helper method for migration classes to call.
> -
> - :param store: The Storm store to load the schema into.
> - :type store: storm.locals.Store`
> - :param version: The schema version identifier of the form
> - YYYYMMDDHHMMSS.
> - :type version: string
> - :param filename: The file name containing the schema to load. Pass
> - `None` if there is no schema file to load.
> - :type filename: string
> - :param module_path: The fully qualified Python module path to the
> - migration module being loaded. This is used to record information
> - for use by the test suite.
> - :type module_path: string
> - """
> - if filename is not None:
> - contents = resource_string('mailman.database.schema', filename)
> - self.load_sql(store, contents)
> - # Add a marker that indicates the migration version being applied.
> - store.add(Version(component='schema', version=version))
> -
> - @staticmethod
> - def _make_temporary():
> - raise NotImplementedError
> + self.engine = create_engine(url)
> + session = sessionmaker(bind=self.engine)
> + self.store = session()
> + self.store.commit()
>
> === removed directory 'src/mailman/database/docs'
> === removed file 'src/mailman/database/docs/__init__.py'
> === removed file 'src/mailman/database/docs/migration.rst'
> --- src/mailman/database/docs/migration.rst 2014-04-28 15:23:35 +0000
> +++ src/mailman/database/docs/migration.rst 1970-01-01 00:00:00 +0000
> @@ -1,207 +0,0 @@
> -=================
> -Schema migrations
> -=================
> -
> -The SQL database schema will over time require upgrading to support new
> -features. This is supported via schema migration.
> -
> -Migrations are embodied in individual Python classes, which themselves may
> -load SQL into the database. The naming scheme for migration files is:
> -
> - mm_YYYYMMDDHHMMSS_comment.py
> -
> -where `YYYYMMDDHHMMSS` is a required numeric year, month, day, hour, minute,
> -and second specifier providing unique ordering for processing. Only this
> -component of the file name is used to determine the ordering. The prefix is
> -required due to Python module naming requirements, but it is actually
> -ignored. `mm_` is reserved for Mailman's own use.
> -
> -The optional `comment` part of the file name can be used as a short
> -description for the migration, although comments and docstrings in the
> -migration files should be used for more detailed descriptions.
> -
> -Migrations are applied automatically when Mailman starts up, but can also be
> -applied at any time by calling in the API directly. Once applied, a
> -migration's version string is registered so it will not be applied again.
> -
> -We see that the base migration, as well as subsequent standard migrations, are
> -already applied.
> -
> - >>> from mailman.model.version import Version
> - >>> results = config.db.store.find(Version, component='schema')
> - >>> results.count()
> - 4
> - >>> versions = sorted(result.version for result in results)
> - >>> for version in versions:
> - ... print(version)
> - 00000000000000
> - 20120407000000
> - 20121015000000
> - 20130406000000
> -
> -
> -Migrations
> -==========
> -
> -Migrations can be loaded at any time, and can be found in the migrations path
> -specified in the configuration file.
> -
> -.. Create a temporary directory for the migrations::
> -
> - >>> import os, sys, tempfile
> - >>> tempdir = tempfile.mkdtemp()
> - >>> path = os.path.join(tempdir, 'migrations')
> - >>> os.makedirs(path)
> - >>> sys.path.append(tempdir)
> - >>> config.push('migrations', """
> - ... [database]
> - ... migrations_path: migrations
> - ... """)
> -
> -.. Clean this up at the end of the doctest.
> - >>> def cleanup():
> - ... import shutil
> - ... from mailman.config import config
> - ... config.pop('migrations')
> - ... shutil.rmtree(tempdir)
> - >>> cleanups.append(cleanup)
> -
> -Here is an example migrations module. The key part of this interface is the
> -``upgrade()`` method, which takes four arguments:
> -
> - * `database` - The database class, as derived from `StormBaseDatabase`
> - * `store` - The Storm `Store` object.
> - * `version` - The version string as derived from the migrations module's file
> - name. This will include only the `YYYYMMDDHHMMSS` string.
> - * `module_path` - The dotted module path to the migrations module, suitable
> - for lookup in `sys.modules`.
> -
> -This migration module just adds a marker to the `version` table.
> -
> - >>> with open(os.path.join(path, '__init__.py'), 'w') as fp:
> - ... pass
> - >>> with open(os.path.join(path, 'mm_20159999000000.py'), 'w') as fp:
> - ... print("""
> - ... from __future__ import unicode_literals
> - ... from mailman.model.version import Version
> - ... def upgrade(database, store, version, module_path):
> - ... v = Version(component='test', version=version)
> - ... store.add(v)
> - ... database.load_schema(store, version, None, module_path)
> - ... """, file=fp)
> -
> -This will load the new migration, since it hasn't been loaded before.
> -
> - >>> config.db.load_migrations()
> - >>> results = config.db.store.find(Version, component='schema')
> - >>> for result in sorted(result.version for result in results):
> - ... print(result)
> - 00000000000000
> - 20120407000000
> - 20121015000000
> - 20130406000000
> - 20159999000000
> - >>> test = config.db.store.find(Version, component='test').one()
> - >>> print(test.version)
> - 20159999000000
> -
> -Migrations will only be loaded once.
> -
> - >>> with open(os.path.join(path, 'mm_20159999000001.py'), 'w') as fp:
> - ... print("""
> - ... from __future__ import unicode_literals
> - ... from mailman.model.version import Version
> - ... _marker = 801
> - ... def upgrade(database, store, version, module_path):
> - ... global _marker
> - ... # Pad enough zeros on the left to reach 14 characters wide.
> - ... marker = '{0:=#014d}'.format(_marker)
> - ... _marker += 1
> - ... v = Version(component='test', version=marker)
> - ... store.add(v)
> - ... database.load_schema(store, version, None, module_path)
> - ... """, file=fp)
> -
> -The first time we load this new migration, we'll get the 801 marker.
> -
> - >>> config.db.load_migrations()
> - >>> results = config.db.store.find(Version, component='schema')
> - >>> for result in sorted(result.version for result in results):
> - ... print(result)
> - 00000000000000
> - 20120407000000
> - 20121015000000
> - 20130406000000
> - 20159999000000
> - 20159999000001
> - >>> test = config.db.store.find(Version, component='test')
> - >>> for marker in sorted(marker.version for marker in test):
> - ... print(marker)
> - 00000000000801
> - 20159999000000
> -
> -We do not get an 802 marker because the migration has already been loaded.
> -
> - >>> config.db.load_migrations()
> - >>> results = config.db.store.find(Version, component='schema')
> - >>> for result in sorted(result.version for result in results):
> - ... print(result)
> - 00000000000000
> - 20120407000000
> - 20121015000000
> - 20130406000000
> - 20159999000000
> - 20159999000001
> - >>> test = config.db.store.find(Version, component='test')
> - >>> for marker in sorted(marker.version for marker in test):
> - ... print(marker)
> - 00000000000801
> - 20159999000000
> -
> -
> -Partial upgrades
> -================
> -
> -It's possible (mostly for testing purposes) to only do a partial upgrade, by
> -providing a timestamp to `load_migrations()`. To demonstrate this, we add two
> -additional migrations, intended to be applied in sequential order.
> -
> - >>> from shutil import copyfile
> - >>> from mailman.testing.helpers import chdir
> - >>> with chdir(path):
> - ... copyfile('mm_20159999000000.py', 'mm_20159999000002.py')
> - ... copyfile('mm_20159999000000.py', 'mm_20159999000003.py')
> - ... copyfile('mm_20159999000000.py', 'mm_20159999000004.py')
> -
> -Now, only migrate to the ...03 timestamp.
> -
> - >>> config.db.load_migrations('20159999000003')
> -
> -You'll notice that the ...04 version is not present.
> -
> - >>> results = config.db.store.find(Version, component='schema')
> - >>> for result in sorted(result.version for result in results):
> - ... print(result)
> - 00000000000000
> - 20120407000000
> - 20121015000000
> - 20130406000000
> - 20159999000000
> - 20159999000001
> - 20159999000002
> - 20159999000003
> -
> -
> -.. cleanup:
> - Because the Version table holds schema migration data, it will not be
> - cleaned up by the standard test suite. This is generally not a problem
> - for SQLite since each test gets a new database file, but for PostgreSQL,
> - this will cause migration.rst to fail on subsequent runs. So let's just
> - clean up the database explicitly.
> -
> - >>> if config.db.TAG != 'sqlite':
> - ... results = config.db.store.execute("""
> - ... DELETE FROM version WHERE version.version >= '201299990000'
> - ... OR version.component = 'test';
> - ... """)
> - ... config.db.commit()
>
> === modified file 'src/mailman/database/factory.py'
> --- src/mailman/database/factory.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/factory.py 2014-10-14 01:13:52 +0000
> @@ -22,25 +22,32 @@
> __metaclass__ = type
> __all__ = [
> 'DatabaseFactory',
> - 'DatabaseTemporaryFactory',
> 'DatabaseTestingFactory',
> ]
>
>
> import os
> import types
> +import alembic.command
>
> +from alembic.migration import MigrationContext
> +from alembic.script import ScriptDirectory
> from flufl.lock import Lock
> -from zope.component import getAdapter
> +from sqlalchemy import MetaData
> from zope.interface import implementer
> from zope.interface.verify import verifyObject
>
> from mailman.config import config
> +from mailman.database.alembic import alembic_cfg
> +from mailman.database.model import Model
> from mailman.interfaces.database import (
> - IDatabase, IDatabaseFactory, ITemporaryDatabase)
> + DatabaseError, IDatabase, IDatabaseFactory)
> from mailman.utilities.modules import call_name
>
>
> +LAST_STORM_SCHEMA_VERSION = '20130406000000'
> +
> +
>
> @implementer(IDatabaseFactory)
> class DatabaseFactory:
> @@ -54,18 +61,69 @@
> database = call_name(database_class)
> verifyObject(IDatabase, database)
> database.initialize()
> - database.load_migrations()
> + SchemaManager(database).setup_database()
> database.commit()
> return database
>
>
>
> +class SchemaManager:
> + "Manage schema migrations."""
> +
> + def __init__(self, database):
> + self._database = database
> + self._script = ScriptDirectory.from_config(alembic_cfg)
> +
> + def _get_storm_schema_version(self):
> + metadata = MetaData()
> + metadata.reflect(bind=self._database.engine)
> + if 'version' not in metadata.tables:
> + # There are no Storm artifacts left.
> + return None
> + Version = metadata.tables['version']
> + last_version = self._database.store.query(Version.c.version).filter(
> + Version.c.component == 'schema'
> + ).order_by(Version.c.version.desc()).first()
> + # Don't leave open transactions or they will block any schema change.
> + self._database.commit()
> + return last_version
> +
> + def setup_database(self):
> + context = MigrationContext.configure(self._database.store.connection())
> + current_rev = context.get_current_revision()
> + head_rev = self._script.get_current_head()
> + if current_rev == head_rev:
> + # We're already at the latest revision so there's nothing to do.
> + return head_rev
> + if current_rev is None:
> + # No Alembic information is available.
> + storm_version = self._get_storm_schema_version()
> + if storm_version is None:
> + # Initial database creation.
> + Model.metadata.create_all(self._database.engine)
> + self._database.commit()
> + alembic.command.stamp(alembic_cfg, 'head')
> + else:
> + # The database was previously managed by Storm.
> + if storm_version.version < LAST_STORM_SCHEMA_VERSION:
> + raise DatabaseError(
> + 'Upgrades skipping beta versions is not supported.')
> + # Run migrations to remove the Storm-specific table and upgrade
> + # to SQLAlchemy and Alembic.
> + alembic.command.upgrade(alembic_cfg, 'head')
> + elif current_rev != head_rev:
> + alembic.command.upgrade(alembic_cfg, 'head')
> + return head_rev
> +
> +
> +
> def _reset(self):
> """See `IDatabase`."""
> - from mailman.database.model import ModelMeta
> + # Avoid a circular import at module level.
> + from mailman.database.model import Model
> self.store.rollback()
> self._pre_reset(self.store)
> - ModelMeta._reset(self.store)
> + Model._reset(self)
> self._post_reset(self.store)
> self.store.commit()
>
> @@ -81,24 +139,8 @@
> database = call_name(database_class)
> verifyObject(IDatabase, database)
> database.initialize()
> - database.load_migrations()
> + Model.metadata.create_all(database.engine)
> database.commit()
> # Make _reset() a bound method of the database instance.
> database._reset = types.MethodType(_reset, database)
> return database
> -
> -
> -
> -@implementer(IDatabaseFactory)
> -class DatabaseTemporaryFactory:
> - """Create a temporary database for some of the migration tests."""
> -
> - @staticmethod
> - def create():
> - """See `IDatabaseFactory`."""
> - database_class_name = config.database['class']
> - database = call_name(database_class_name)
> - verifyObject(IDatabase, database)
> - adapted_database = getAdapter(
> - database, ITemporaryDatabase, database.TAG)
> - return adapted_database
>
> === modified file 'src/mailman/database/model.py'
> --- src/mailman/database/model.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/model.py 2014-10-14 01:13:52 +0000
> @@ -25,44 +25,34 @@
> ]
>
>
> -from operator import attrgetter
> -
> -from storm.properties import PropertyPublisherMeta
> -
> -
> -
> -class ModelMeta(PropertyPublisherMeta):
> - """Do more magic on table classes."""
> -
> - _class_registry = set()
> -
> - def __init__(self, name, bases, dict):
> - # Before we let the base class do it's thing, force an __storm_table__
> - # property to enforce our table naming convention.
> - self.__storm_table__ = name.lower()
> - super(ModelMeta, self).__init__(name, bases, dict)
> - # Register the model class so that it can be more easily cleared.
> - # This is required by the test framework so that the corresponding
> - # table can be reset between tests.
> - #
> - # The PRESERVE flag indicates whether the table should be reset or
> - # not. We have to handle the actual Model base class explicitly
> - # because it does not correspond to a table in the database.
> - if not getattr(self, 'PRESERVE', False) and name != 'Model':
> - ModelMeta._class_registry.add(self)
> -
> +import contextlib
> +
> +from sqlalchemy.ext.declarative import declarative_base
> +
> +from mailman.config import config
> +
> +
> +class ModelMeta:
> + """The custom metaclass for all model base classes.
> +
> + This is used in the test suite to quickly reset the database after each
> + test. It works by iterating over all the tables, deleting each. The test
> + suite will then recreate the tables before each test.
> + """
> @staticmethod
> - def _reset(store):
> - from mailman.config import config
> - config.db._pre_reset(store)
> - # Make sure this is deterministic, by sorting on the storm table name.
> - classes = sorted(ModelMeta._class_registry,
> - key=attrgetter('__storm_table__'))
> - for model_class in classes:
> - store.find(model_class).remove()
> -
> -
> -
> -class Model:
> - """Like Storm's `Storm` subclass, but with a bit extra."""
> - __metaclass__ = ModelMeta
> + def _reset(db):
> + with contextlib.closing(config.db.engine.connect()) as connection:
> + transaction = connection.begin()
> + try:
> + # Delete all the tables in reverse foreign key dependency
> + # order. http://tinyurl.com/on8dy6f
> + for table in reversed(Model.metadata.sorted_tables):
> + connection.execute(table.delete())
> + except:
> + transaction.rollback()
> + raise
> + else:
> + transaction.commit()
> +
> +
> +Model = declarative_base(cls=ModelMeta)
>
> === modified file 'src/mailman/database/postgresql.py'
> --- src/mailman/database/postgresql.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/postgresql.py 2014-10-14 01:13:52 +0000
> @@ -22,34 +22,17 @@
> __metaclass__ = type
> __all__ = [
> 'PostgreSQLDatabase',
> - 'make_temporary',
> ]
>
>
> -import types
> -
> -from functools import partial
> -from operator import attrgetter
> -from urlparse import urlsplit, urlunsplit
> -
> -from mailman.database.base import StormBaseDatabase
> -from mailman.testing.helpers import configuration
> +from mailman.database.base import SABaseDatabase
> +from mailman.database.model import Model
>
>
>
> -class PostgreSQLDatabase(StormBaseDatabase):
> +class PostgreSQLDatabase(SABaseDatabase):
> """Database class for PostgreSQL."""
>
> - TAG = 'postgres'
> -
> - def _database_exists(self, store):
> - """See `BaseDatabase`."""
> - table_query = ('SELECT table_name FROM information_schema.tables '
> - "WHERE table_schema = 'public'")
> - results = store.execute(table_query)
> - table_names = set(item[0] for item in results)
> - return 'version' in table_names
> -
> def _post_reset(self, store):
> """PostgreSQL-specific test suite cleanup.
>
> @@ -57,49 +40,13 @@
> restart from zero for new tests.
> """
> super(PostgreSQLDatabase, self)._post_reset(store)
> - from mailman.database.model import ModelMeta
> - classes = sorted(ModelMeta._class_registry,
> - key=attrgetter('__storm_table__'))
> + tables = reversed(Model.metadata.sorted_tables)
> # Recipe adapted from
> # http://stackoverflow.com/questions/544791/
> # django-postgresql-how-to-reset-primary-key
> - for model_class in classes:
> + for table in tables:
> store.execute("""\
> SELECT setval('"{0}_id_seq"', coalesce(max("id"), 1),
> max("id") IS NOT null)
> FROM "{0}";
> - """.format(model_class.__storm_table__))
> -
> -
> -
> -# Test suite adapter for ITemporaryDatabase.
> -
> -def _cleanup(self, store, tempdb_name):
> - from mailman.config import config
> - store.rollback()
> - store.close()
> - # From the original database connection, drop the now unused database.
> - config.db.store.execute('DROP DATABASE {0}'.format(tempdb_name))
> -
> -
> -def make_temporary(database):
> - """Adapts by monkey patching an existing PostgreSQL IDatabase."""
> - from mailman.config import config
> - parts = urlsplit(config.database.url)
> - assert parts.scheme == 'postgres'
> - new_parts = list(parts)
> - new_parts[2] = '/mmtest'
> - url = urlunsplit(new_parts)
> - # Use the existing database connection to create a new testing
> - # database.
> - config.db.store.execute('ABORT;')
> - config.db.store.execute('CREATE DATABASE mmtest;')
> - with configuration('database', url=url):
> - database.initialize()
> - database._cleanup = types.MethodType(
> - partial(_cleanup, store=database.store, tempdb_name='mmtest'),
> - database)
> - # bool column values in PostgreSQL.
> - database.FALSE = 'False'
> - database.TRUE = 'True'
> - return database
> + """.format(table))
>
> === removed directory 'src/mailman/database/schema'
> === removed file 'src/mailman/database/schema/__init__.py'
> === removed file 'src/mailman/database/schema/helpers.py'
> --- src/mailman/database/schema/helpers.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/schema/helpers.py 1970-01-01 00:00:00 +0000
> @@ -1,43 +0,0 @@
> -# Copyright (C) 2013-2014 by the Free Software Foundation, Inc.
> -#
> -# This file is part of GNU Mailman.
> -#
> -# GNU Mailman is free software: you can redistribute it and/or modify it under
> -# the terms of the GNU General Public License as published by the Free
> -# Software Foundation, either version 3 of the License, or (at your option)
> -# any later version.
> -#
> -# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> -# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> -# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> -# more details.
> -#
> -# You should have received a copy of the GNU General Public License along with
> -# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> -
> -"""Schema migration helpers."""
> -
> -from __future__ import absolute_import, print_function, unicode_literals
> -
> -__metaclass__ = type
> -__all__ = [
> - 'make_listid',
> - ]
> -
> -
> -
> -def make_listid(fqdn_listname):
> - """Turn a FQDN list name into a List-ID."""
> - list_name, at, mail_host = fqdn_listname.partition('@')
> - if at == '':
> - # If there is no @ sign in the value, assume it already contains the
> - # list-id.
> - return fqdn_listname
> - return '{0}.{1}'.format(list_name, mail_host)
> -
> -
> -
> -def pivot(store, table_name):
> - """Pivot a backup table into the real table name."""
> - store.execute('DROP TABLE {}'.format(table_name))
> - store.execute('ALTER TABLE {0}_backup RENAME TO {0}'.format(table_name))
>
> === removed file 'src/mailman/database/schema/mm_00000000000000_base.py'
> --- src/mailman/database/schema/mm_00000000000000_base.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/schema/mm_00000000000000_base.py 1970-01-01 00:00:00 +0000
> @@ -1,35 +0,0 @@
> -# Copyright (C) 2012-2014 by the Free Software Foundation, Inc.
> -#
> -# This file is part of GNU Mailman.
> -#
> -# GNU Mailman is free software: you can redistribute it and/or modify it under
> -# the terms of the GNU General Public License as published by the Free
> -# Software Foundation, either version 3 of the License, or (at your option)
> -# any later version.
> -#
> -# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> -# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> -# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> -# more details.
> -#
> -# You should have received a copy of the GNU General Public License along with
> -# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> -
> -"""Load the base schema."""
> -
> -from __future__ import absolute_import, print_function, unicode_literals
> -
> -__metaclass__ = type
> -__all__ = [
> - 'upgrade',
> - ]
> -
> -
> -VERSION = '00000000000000'
> -_helper = None
> -
> -
> -
> -def upgrade(database, store, version, module_path):
> - filename = '{0}.sql'.format(database.TAG)
> - database.load_schema(store, version, filename, module_path)
>
> === removed file 'src/mailman/database/schema/mm_20120407000000.py'
> --- src/mailman/database/schema/mm_20120407000000.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/schema/mm_20120407000000.py 1970-01-01 00:00:00 +0000
> @@ -1,212 +0,0 @@
> -# Copyright (C) 2012-2014 by the Free Software Foundation, Inc.
> -#
> -# This file is part of GNU Mailman.
> -#
> -# GNU Mailman is free software: you can redistribute it and/or modify it under
> -# the terms of the GNU General Public License as published by the Free
> -# Software Foundation, either version 3 of the License, or (at your option)
> -# any later version.
> -#
> -# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> -# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> -# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> -# more details.
> -#
> -# You should have received a copy of the GNU General Public License along with
> -# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> -
> -"""3.0b1 -> 3.0b2 schema migrations.
> -
> -All column changes are in the `mailinglist` table.
> -
> -* Renames:
> - - news_prefix_subject_too -> nntp_prefix_subject_too
> - - news_moderation -> newsgroup_moderation
> -
> -* Collapsing:
> - - archive, archive_private -> archive_policy
> -
> -* Remove:
> - - archive_volume_frequency
> - - generic_nonmember_action
> - - nntp_host
> -
> -* Added:
> - - list_id
> -
> -* Changes:
> - member.mailing_list holds the list_id not the fqdn_listname
> -
> -See https://bugs.launchpad.net/mailman/+bug/971013 for details.
> -"""
> -
> -from __future__ import absolute_import, print_function, unicode_literals
> -
> -__metaclass__ = type
> -__all__ = [
> - 'upgrade',
> - ]
> -
> -
> -from mailman.database.schema.helpers import pivot
> -from mailman.interfaces.archiver import ArchivePolicy
> -
> -
> -VERSION = '20120407000000'
> -
> -
> -
> -def upgrade(database, store, version, module_path):
> - if database.TAG == 'sqlite':
> - upgrade_sqlite(database, store, version, module_path)
> - else:
> - upgrade_postgres(database, store, version, module_path)
> -
> -
> -
> -def archive_policy(archive, archive_private):
> - """Convert archive and archive_private to archive_policy."""
> - if archive == 0:
> - return ArchivePolicy.never.value
> - elif archive_private == 1:
> - return ArchivePolicy.private.value
> - else:
> - return ArchivePolicy.public.value
> -
> -
> -
> -def upgrade_sqlite(database, store, version, module_path):
> - # Load the first part of the migration. This creates a temporary table to
> - # hold the new mailinglist table columns. The problem is that some of the
> - # changes must be performed in Python, so after the first part is loaded,
> - # we do the Python changes, drop the old mailing list table, and then
> - # rename the temporary table to its place.
> - database.load_schema(
> - store, version, 'sqlite_{0}_01.sql'.format(version), module_path)
> - results = store.execute("""
> - SELECT id, include_list_post_header,
> - news_prefix_subject_too, news_moderation,
> - archive, archive_private, list_name, mail_host
> - FROM mailinglist;
> - """)
> - for value in results:
> - (id, list_post,
> - news_prefix, news_moderation,
> - archive, archive_private,
> - list_name, mail_host) = value
> - # Figure out what the new archive_policy column value should be.
> - list_id = '{0}.{1}'.format(list_name, mail_host)
> - fqdn_listname = '{0}@{1}'.format(list_name, mail_host)
> - store.execute("""
> - UPDATE mailinglist_backup SET
> - allow_list_posts = {0},
> - newsgroup_moderation = {1},
> - nntp_prefix_subject_too = {2},
> - archive_policy = {3},
> - list_id = '{4}'
> - WHERE id = {5};
> - """.format(
> - list_post,
> - news_moderation,
> - news_prefix,
> - archive_policy(archive, archive_private),
> - list_id,
> - id))
> - # Also update the member.mailing_list column to hold the list_id
> - # instead of the fqdn_listname.
> - store.execute("""
> - UPDATE member SET
> - mailing_list = '{0}'
> - WHERE mailing_list = '{1}';
> - """.format(list_id, fqdn_listname))
> - # Pivot the backup table to the real thing.
> - pivot(store, 'mailinglist')
> - # Now add some indexes that were previously missing.
> - store.execute(
> - 'CREATE INDEX ix_mailinglist_list_id ON mailinglist (list_id);')
> - store.execute(
> - 'CREATE INDEX ix_mailinglist_fqdn_listname '
> - 'ON mailinglist (list_name, mail_host);')
> - # Now, do the member table.
> - results = store.execute('SELECT id, mailing_list FROM member;')
> - for id, mailing_list in results:
> - list_name, at, mail_host = mailing_list.partition('@')
> - if at == '':
> - list_id = mailing_list
> - else:
> - list_id = '{0}.{1}'.format(list_name, mail_host)
> - store.execute("""
> - UPDATE member_backup SET list_id = '{0}'
> - WHERE id = {1};
> - """.format(list_id, id))
> - # Pivot the backup table to the real thing.
> - pivot(store, 'member')
> -
> -
> -
> -def upgrade_postgres(database, store, version, module_path):
> - # Get the old values from the mailinglist table.
> - results = store.execute("""
> - SELECT id, archive, archive_private, list_name, mail_host
> - FROM mailinglist;
> - """)
> - # Do the simple renames first.
> - store.execute("""
> - ALTER TABLE mailinglist
> - RENAME COLUMN news_prefix_subject_too TO nntp_prefix_subject_too;
> - """)
> - store.execute("""
> - ALTER TABLE mailinglist
> - RENAME COLUMN news_moderation TO newsgroup_moderation;
> - """)
> - store.execute("""
> - ALTER TABLE mailinglist
> - RENAME COLUMN include_list_post_header TO allow_list_posts;
> - """)
> - # Do the easy column drops next.
> - for column in ('archive_volume_frequency',
> - 'generic_nonmember_action',
> - 'nntp_host'):
> - store.execute(
> - 'ALTER TABLE mailinglist DROP COLUMN {0};'.format(column))
> - # Now do the trickier collapsing of values. Add the new columns.
> - store.execute('ALTER TABLE mailinglist ADD COLUMN archive_policy INTEGER;')
> - store.execute('ALTER TABLE mailinglist ADD COLUMN list_id TEXT;')
> - # Query the database for the old values of archive and archive_private in
> - # each column. Then loop through all the results and update the new
> - # archive_policy from the old values.
> - for value in results:
> - id, archive, archive_private, list_name, mail_host = value
> - list_id = '{0}.{1}'.format(list_name, mail_host)
> - store.execute("""
> - UPDATE mailinglist SET
> - archive_policy = {0},
> - list_id = '{1}'
> - WHERE id = {2};
> - """.format(archive_policy(archive, archive_private), list_id, id))
> - # Now drop the old columns.
> - for column in ('archive', 'archive_private'):
> - store.execute(
> - 'ALTER TABLE mailinglist DROP COLUMN {0};'.format(column))
> - # Now add some indexes that were previously missing.
> - store.execute(
> - 'CREATE INDEX ix_mailinglist_list_id ON mailinglist (list_id);')
> - store.execute(
> - 'CREATE INDEX ix_mailinglist_fqdn_listname '
> - 'ON mailinglist (list_name, mail_host);')
> - # Now, do the member table.
> - results = store.execute('SELECT id, mailing_list FROM member;')
> - store.execute('ALTER TABLE member ADD COLUMN list_id TEXT;')
> - for id, mailing_list in results:
> - list_name, at, mail_host = mailing_list.partition('@')
> - if at == '':
> - list_id = mailing_list
> - else:
> - list_id = '{0}.{1}'.format(list_name, mail_host)
> - store.execute("""
> - UPDATE member SET list_id = '{0}'
> - WHERE id = {1};
> - """.format(list_id, id))
> - store.execute('ALTER TABLE member DROP COLUMN mailing_list;')
> - # Record the migration in the version table.
> - database.load_schema(store, version, None, module_path)
>
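
A note on the migration pattern in the removed upgrade_sqlite() above: because SQLite's ALTER TABLE historically supports only RENAME TO and ADD COLUMN, these Storm-era migrations build a *_backup table in the new shape via SQL, fix up values from Python, and then pivot the backup into the original table's place. A minimal, self-contained sketch of that flow, using the sqlite3 module and a made-up `example` table rather than Mailman's real schema or its pivot() helper, looks roughly like this:

    import sqlite3

    def pivot(cursor, table):
        # Replace the original table with its <table>_backup counterpart.
        cursor.execute('DROP TABLE {0};'.format(table))
        cursor.execute('ALTER TABLE {0}_backup RENAME TO {0};'.format(table))

    conn = sqlite3.connect(':memory:')
    cur = conn.cursor()
    cur.execute('CREATE TABLE example (id INTEGER PRIMARY KEY, fqdn_listname TEXT);')
    cur.execute("INSERT INTO example VALUES (1, 'test@example.com');")
    # Phase 1: build the backup table with the new column layout and copy rows.
    cur.execute('CREATE TABLE example_backup (id INTEGER PRIMARY KEY, list_id TEXT);')
    cur.execute('INSERT INTO example_backup SELECT id, fqdn_listname FROM example;')
    # Phase 2: fix up values in Python, e.g. test@example.com -> test.example.com.
    for row_id, value in cur.execute(
            'SELECT id, list_id FROM example_backup;').fetchall():
        cur.execute('UPDATE example_backup SET list_id = ? WHERE id = ?;',
                    (value.replace('@', '.', 1), row_id))
    # Phase 3: pivot the backup table into place of the original.
    pivot(cur, 'example')
    print(cur.execute('SELECT * FROM example;').fetchall())
    # -> [(1, 'test.example.com')]

The real migrations do the same three steps, just with the full mailinglist and member column lists.
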
> === removed file 'src/mailman/database/schema/mm_20121015000000.py'
> --- src/mailman/database/schema/mm_20121015000000.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/schema/mm_20121015000000.py 1970-01-01 00:00:00 +0000
> @@ -1,95 +0,0 @@
> -# Copyright (C) 2012-2014 by the Free Software Foundation, Inc.
> -#
> -# This file is part of GNU Mailman.
> -#
> -# GNU Mailman is free software: you can redistribute it and/or modify it under
> -# the terms of the GNU General Public License as published by the Free
> -# Software Foundation, either version 3 of the License, or (at your option)
> -# any later version.
> -#
> -# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> -# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> -# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> -# more details.
> -#
> -# You should have received a copy of the GNU General Public License along with
> -# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> -
> -"""3.0b2 -> 3.0b3 schema migrations.
> -
> -Renamed:
> - * bans.mailing_list -> bans.list_id
> -
> -Removed:
> - * mailinglist.new_member_options
> - * mailinglist.send_reminders
> -"""
> -
> -from __future__ import absolute_import, print_function, unicode_literals
> -
> -__metaclass__ = type
> -__all__ = [
> - 'upgrade',
> - ]
> -
> -
> -from mailman.database.schema.helpers import make_listid, pivot
> -
> -
> -VERSION = '20121015000000'
> -
> -
> -
> -def upgrade(database, store, version, module_path):
> - if database.TAG == 'sqlite':
> - upgrade_sqlite(database, store, version, module_path)
> - else:
> - upgrade_postgres(database, store, version, module_path)
> -
> -
> -
> -def upgrade_sqlite(database, store, version, module_path):
> - database.load_schema(
> - store, version, 'sqlite_{}_01.sql'.format(version), module_path)
> - results = store.execute("""
> - SELECT id, mailing_list
> - FROM ban;
> - """)
> - for id, mailing_list in results:
> - # Skip global bans since there's nothing to update.
> - if mailing_list is None:
> - continue
> - store.execute("""
> - UPDATE ban_backup SET list_id = '{}'
> - WHERE id = {};
> - """.format(make_listid(mailing_list), id))
> - # Pivot the bans backup table to the real thing.
> - pivot(store, 'ban')
> - pivot(store, 'mailinglist')
> -
> -
> -
> -def upgrade_postgres(database, store, version, module_path):
> - # Get the old values from the ban table.
> - results = store.execute('SELECT id, mailing_list FROM ban;')
> - store.execute('ALTER TABLE ban ADD COLUMN list_id TEXT;')
> - for id, mailing_list in results:
> - # Skip global bans since there's nothing to update.
> - if mailing_list is None:
> - continue
> - store.execute("""
> - UPDATE ban SET list_id = '{0}'
> - WHERE id = {1};
> - """.format(make_listid(mailing_list), id))
> - store.execute('ALTER TABLE ban DROP COLUMN mailing_list;')
> - store.execute('ALTER TABLE mailinglist DROP COLUMN new_member_options;')
> - store.execute('ALTER TABLE mailinglist DROP COLUMN send_reminders;')
> - store.execute('ALTER TABLE mailinglist DROP COLUMN subscribe_policy;')
> - store.execute('ALTER TABLE mailinglist DROP COLUMN unsubscribe_policy;')
> - store.execute(
> - 'ALTER TABLE mailinglist DROP COLUMN subscribe_auto_approval;')
> - store.execute('ALTER TABLE mailinglist DROP COLUMN private_roster;')
> - store.execute(
> - 'ALTER TABLE mailinglist DROP COLUMN admin_member_chunksize;')
> - # Record the migration in the version table.
> - database.load_schema(store, version, None, module_path)
>
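
The removed migration above relies on make_listid() from mailman.database.schema.helpers to turn a stored fully-qualified list name into a list id. Judging from the equivalent inline conversion in the earlier migration (partition on '@', join with a dot, pass through values with no '@'), the helper presumably behaves like this sketch (a hypothetical re-statement, not the actual source):

    def make_listid(fqdn_listname):
        # 'test@example.com' becomes 'test.example.com'; values without an
        # '@' are assumed to already be list ids and are returned unchanged.
        list_name, at, mail_host = fqdn_listname.partition('@')
        if at == '':
            return fqdn_listname
        return '{0}.{1}'.format(list_name, mail_host)

    assert make_listid('test@example.com') == 'test.example.com'
    assert make_listid('already.a.list.id') == 'already.a.list.id'
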
> === removed file 'src/mailman/database/schema/mm_20130406000000.py'
> --- src/mailman/database/schema/mm_20130406000000.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/schema/mm_20130406000000.py 1970-01-01 00:00:00 +0000
> @@ -1,65 +0,0 @@
> -# Copyright (C) 2013-2014 by the Free Software Foundation, Inc.
> -#
> -# This file is part of GNU Mailman.
> -#
> -# GNU Mailman is free software: you can redistribute it and/or modify it under
> -# the terms of the GNU General Public License as published by the Free
> -# Software Foundation, either version 3 of the License, or (at your option)
> -# any later version.
> -#
> -# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> -# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> -# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> -# more details.
> -#
> -# You should have received a copy of the GNU General Public License along with
> -# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> -
> -"""3.0b3 -> 3.0b4 schema migrations.
> -
> -Renamed:
> - * bounceevent.list_name -> bounceevent.list_id
> -"""
> -
> -
> -from __future__ import absolute_import, print_function, unicode_literals
> -
> -__metaclass__ = type
> -__all__ = [
> - 'upgrade'
> - ]
> -
> -
> -from mailman.database.schema.helpers import make_listid, pivot
> -
> -
> -VERSION = '20130406000000'
> -
> -
> -
> -def upgrade(database, store, version, module_path):
> - if database.TAG == 'sqlite':
> - upgrade_sqlite(database, store, version, module_path)
> - else:
> - upgrade_postgres(database, store, version, module_path)
> -
> -
> -
> -def upgrade_sqlite(database, store, version, module_path):
> - database.load_schema(
> - store, version, 'sqlite_{}_01.sql'.format(version), module_path)
> - results = store.execute("""
> - SELECT id, list_name
> - FROM bounceevent;
> - """)
> - for id, list_name in results:
> - store.execute("""
> - UPDATE bounceevent_backup SET list_id = '{}'
> - WHERE id = {};
> - """.format(make_listid(list_name), id))
> - pivot(store, 'bounceevent')
> -
> -
> -
> -def upgrade_postgres(database, store, version, module_path):
> - pass
>
> === removed file 'src/mailman/database/schema/postgres.sql'
> --- src/mailman/database/schema/postgres.sql 2012-07-23 14:40:53 +0000
> +++ src/mailman/database/schema/postgres.sql 1970-01-01 00:00:00 +0000
> @@ -1,349 +0,0 @@
> -CREATE TABLE mailinglist (
> - id SERIAL NOT NULL,
> - -- List identity
> - list_name TEXT,
> - mail_host TEXT,
> - include_list_post_header BOOLEAN,
> - include_rfc2369_headers BOOLEAN,
> - -- Attributes not directly modifiable via the web u/i
> - created_at TIMESTAMP,
> - admin_member_chunksize INTEGER,
> - next_request_id INTEGER,
> - next_digest_number INTEGER,
> - digest_last_sent_at TIMESTAMP,
> - volume INTEGER,
> - last_post_at TIMESTAMP,
> - accept_these_nonmembers BYTEA,
> - acceptable_aliases_id INTEGER,
> - admin_immed_notify BOOLEAN,
> - admin_notify_mchanges BOOLEAN,
> - administrivia BOOLEAN,
> - advertised BOOLEAN,
> - anonymous_list BOOLEAN,
> - archive BOOLEAN,
> - archive_private BOOLEAN,
> - archive_volume_frequency INTEGER,
> - -- Automatic responses.
> - autorespond_owner INTEGER,
> - autoresponse_owner_text TEXT,
> - autorespond_postings INTEGER,
> - autoresponse_postings_text TEXT,
> - autorespond_requests INTEGER,
> - autoresponse_request_text TEXT,
> - autoresponse_grace_period TEXT,
> - -- Bounces.
> - forward_unrecognized_bounces_to INTEGER,
> - process_bounces BOOLEAN,
> - bounce_info_stale_after TEXT,
> - bounce_matching_headers TEXT,
> - bounce_notify_owner_on_disable BOOLEAN,
> - bounce_notify_owner_on_removal BOOLEAN,
> - bounce_score_threshold INTEGER,
> - bounce_you_are_disabled_warnings INTEGER,
> - bounce_you_are_disabled_warnings_interval TEXT,
> - -- Content filtering.
> - filter_action INTEGER,
> - filter_content BOOLEAN,
> - collapse_alternatives BOOLEAN,
> - convert_html_to_plaintext BOOLEAN,
> - default_member_action INTEGER,
> - default_nonmember_action INTEGER,
> - description TEXT,
> - digest_footer_uri TEXT,
> - digest_header_uri TEXT,
> - digest_is_default BOOLEAN,
> - digest_send_periodic BOOLEAN,
> - digest_size_threshold REAL,
> - digest_volume_frequency INTEGER,
> - digestable BOOLEAN,
> - discard_these_nonmembers BYTEA,
> - emergency BOOLEAN,
> - encode_ascii_prefixes BOOLEAN,
> - first_strip_reply_to BOOLEAN,
> - footer_uri TEXT,
> - forward_auto_discards BOOLEAN,
> - gateway_to_mail BOOLEAN,
> - gateway_to_news BOOLEAN,
> - generic_nonmember_action INTEGER,
> - goodbye_message_uri TEXT,
> - header_matches BYTEA,
> - header_uri TEXT,
> - hold_these_nonmembers BYTEA,
> - info TEXT,
> - linked_newsgroup TEXT,
> - max_days_to_hold INTEGER,
> - max_message_size INTEGER,
> - max_num_recipients INTEGER,
> - member_moderation_notice TEXT,
> - mime_is_default_digest BOOLEAN,
> - moderator_password TEXT,
> - new_member_options INTEGER,
> - news_moderation INTEGER,
> - news_prefix_subject_too BOOLEAN,
> - nntp_host TEXT,
> - nondigestable BOOLEAN,
> - nonmember_rejection_notice TEXT,
> - obscure_addresses BOOLEAN,
> - owner_chain TEXT,
> - owner_pipeline TEXT,
> - personalize INTEGER,
> - post_id INTEGER,
> - posting_chain TEXT,
> - posting_pipeline TEXT,
> - preferred_language TEXT,
> - private_roster BOOLEAN,
> - display_name TEXT,
> - reject_these_nonmembers BYTEA,
> - reply_goes_to_list INTEGER,
> - reply_to_address TEXT,
> - require_explicit_destination BOOLEAN,
> - respond_to_post_requests BOOLEAN,
> - scrub_nondigest BOOLEAN,
> - send_goodbye_message BOOLEAN,
> - send_reminders BOOLEAN,
> - send_welcome_message BOOLEAN,
> - subject_prefix TEXT,
> - subscribe_auto_approval BYTEA,
> - subscribe_policy INTEGER,
> - topics BYTEA,
> - topics_bodylines_limit INTEGER,
> - topics_enabled BOOLEAN,
> - unsubscribe_policy INTEGER,
> - welcome_message_uri TEXT,
> - -- This was accidentally added by the PostgreSQL porter.
> - -- moderation_callback TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE _request (
> - id SERIAL NOT NULL,
> - "key" TEXT,
> - request_type INTEGER,
> - data_hash BYTEA,
> - mailing_list_id INTEGER,
> - PRIMARY KEY (id)
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- ,
> - -- CONSTRAINT _request_mailing_list_id_fk
> - -- FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -
> -CREATE TABLE acceptablealias (
> - id SERIAL NOT NULL,
> - "alias" TEXT NOT NULL,
> - mailing_list_id INTEGER NOT NULL,
> - PRIMARY KEY (id)
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- ,
> - -- CONSTRAINT acceptablealias_mailing_list_id_fk
> - -- FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -CREATE INDEX ix_acceptablealias_mailing_list_id
> - ON acceptablealias (mailing_list_id);
> -CREATE INDEX ix_acceptablealias_alias ON acceptablealias ("alias");
> -
> -CREATE TABLE preferences (
> - id SERIAL NOT NULL,
> - acknowledge_posts BOOLEAN,
> - hide_address BOOLEAN,
> - preferred_language TEXT,
> - receive_list_copy BOOLEAN,
> - receive_own_postings BOOLEAN,
> - delivery_mode INTEGER,
> - delivery_status INTEGER,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE address (
> - id SERIAL NOT NULL,
> - email TEXT,
> - _original TEXT,
> - display_name TEXT,
> - verified_on TIMESTAMP,
> - registered_on TIMESTAMP,
> - user_id INTEGER,
> - preferences_id INTEGER,
> - PRIMARY KEY (id)
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- ,
> - -- CONSTRAINT address_preferences_id_fk
> - -- FOREIGN KEY (preferences_id) REFERENCES preferences (id)
> - );
> -
> -CREATE TABLE "user" (
> - id SERIAL NOT NULL,
> - display_name TEXT,
> - password BYTEA,
> - _user_id UUID,
> - _created_on TIMESTAMP,
> - _preferred_address_id INTEGER,
> - preferences_id INTEGER,
> - PRIMARY KEY (id)
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- ,
> - -- CONSTRAINT user_preferences_id_fk
> - -- FOREIGN KEY (preferences_id) REFERENCES preferences (id),
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- CONSTRAINT _preferred_address_id_fk
> - -- FOREIGN KEY (_preferred_address_id) REFERENCES address (id)
> - );
> -CREATE INDEX ix_user_user_id ON "user" (_user_id);
> -
> --- since user and address have circular foreign key refs, the
> --- constraint on the address table has to be added after
> --- the user table is created
> ---
> --- XXX: users.rst triggers an IntegrityError
> --- ALTER TABLE address ADD
> --- CONSTRAINT address_user_id_fk
> --- FOREIGN KEY (user_id) REFERENCES "user" (id);
> -
> -CREATE TABLE autoresponserecord (
> - id SERIAL NOT NULL,
> - address_id INTEGER,
> - mailing_list_id INTEGER,
> - response_type INTEGER,
> - date_sent TIMESTAMP,
> - PRIMARY KEY (id)
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- ,
> - -- CONSTRAINT autoresponserecord_address_id_fk
> - -- FOREIGN KEY (address_id) REFERENCES address (id)
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- ,
> - -- CONSTRAINT autoresponserecord_mailing_list_id
> - -- FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -CREATE INDEX ix_autoresponserecord_address_id
> - ON autoresponserecord (address_id);
> -CREATE INDEX ix_autoresponserecord_mailing_list_id
> - ON autoresponserecord (mailing_list_id);
> -
> -CREATE TABLE bounceevent (
> - id SERIAL NOT NULL,
> - list_name TEXT,
> - email TEXT,
> - "timestamp" TIMESTAMP,
> - message_id TEXT,
> - context INTEGER,
> - processed BOOLEAN,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE contentfilter (
> - id SERIAL NOT NULL,
> - mailing_list_id INTEGER,
> - filter_pattern TEXT,
> - filter_type INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT contentfilter_mailing_list_id
> - FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -CREATE INDEX ix_contentfilter_mailing_list_id
> - ON contentfilter (mailing_list_id);
> -
> -CREATE TABLE domain (
> - id SERIAL NOT NULL,
> - mail_host TEXT,
> - base_url TEXT,
> - description TEXT,
> - contact_address TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE language (
> - id SERIAL NOT NULL,
> - code TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE member (
> - id SERIAL NOT NULL,
> - _member_id UUID,
> - role INTEGER,
> - mailing_list TEXT,
> - moderation_action INTEGER,
> - address_id INTEGER,
> - preferences_id INTEGER,
> - user_id INTEGER,
> - PRIMARY KEY (id)
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- ,
> - -- CONSTRAINT member_address_id_fk
> - -- FOREIGN KEY (address_id) REFERENCES address (id),
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- CONSTRAINT member_preferences_id_fk
> - -- FOREIGN KEY (preferences_id) REFERENCES preferences (id),
> - -- CONSTRAINT member_user_id_fk
> - -- FOREIGN KEY (user_id) REFERENCES "user" (id)
> - );
> -CREATE INDEX ix_member__member_id ON member (_member_id);
> -CREATE INDEX ix_member_address_id ON member (address_id);
> -CREATE INDEX ix_member_preferences_id ON member (preferences_id);
> -
> -CREATE TABLE message (
> - id SERIAL NOT NULL,
> - message_id_hash BYTEA,
> - path BYTEA,
> - message_id TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE onelastdigest (
> - id SERIAL NOT NULL,
> - mailing_list_id INTEGER,
> - address_id INTEGER,
> - delivery_mode INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT onelastdigest_mailing_list_id_fk
> - FOREIGN KEY (mailing_list_id) REFERENCES mailinglist(id),
> - CONSTRAINT onelastdigest_address_id_fk
> - FOREIGN KEY (address_id) REFERENCES address(id)
> - );
> -
> -CREATE TABLE pended (
> - id SERIAL NOT NULL,
> - token BYTEA,
> - expiration_date TIMESTAMP,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE pendedkeyvalue (
> - id SERIAL NOT NULL,
> - "key" TEXT,
> - value TEXT,
> - pended_id INTEGER,
> - PRIMARY KEY (id)
> - -- ,
> - -- XXX: config.db_reset() triggers IntegrityError
> - -- CONSTRAINT pendedkeyvalue_pended_id_fk
> - -- FOREIGN KEY (pended_id) REFERENCES pended (id)
> - );
> -
> -CREATE TABLE version (
> - id SERIAL NOT NULL,
> - component TEXT,
> - version TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE INDEX ix__request_mailing_list_id ON _request (mailing_list_id);
> -CREATE INDEX ix_address_preferences_id ON address (preferences_id);
> -CREATE INDEX ix_address_user_id ON address (user_id);
> -CREATE INDEX ix_pendedkeyvalue_pended_id ON pendedkeyvalue (pended_id);
> -CREATE INDEX ix_user_preferences_id ON "user" (preferences_id);
> -
> -CREATE TABLE ban (
> - id SERIAL NOT NULL,
> - email TEXT,
> - mailing_list TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE uid (
> - -- Keep track of all assigned unique ids to prevent re-use.
> - id SERIAL NOT NULL,
> - uid UUID,
> - PRIMARY KEY (id)
> - );
> -CREATE INDEX ix_uid_uid ON uid (uid);
>
> === removed file 'src/mailman/database/schema/sqlite.sql'
> --- src/mailman/database/schema/sqlite.sql 2012-04-08 16:15:29 +0000
> +++ src/mailman/database/schema/sqlite.sql 1970-01-01 00:00:00 +0000
> @@ -1,327 +0,0 @@
> --- THIS FILE HAS BEEN FROZEN AS OF 3.0b1
> --- SEE THE SCHEMA MIGRATIONS FOR DIFFERENCES.
> -
> -PRAGMA foreign_keys = ON;
> -
> -CREATE TABLE _request (
> - id INTEGER NOT NULL,
> - "key" TEXT,
> - request_type INTEGER,
> - data_hash TEXT,
> - mailing_list_id INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT _request_mailing_list_id_fk
> - FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -
> -CREATE TABLE acceptablealias (
> - id INTEGER NOT NULL,
> - "alias" TEXT NOT NULL,
> - mailing_list_id INTEGER NOT NULL,
> - PRIMARY KEY (id),
> - CONSTRAINT acceptablealias_mailing_list_id_fk
> - FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -CREATE INDEX ix_acceptablealias_mailing_list_id
> - ON acceptablealias (mailing_list_id);
> -CREATE INDEX ix_acceptablealias_alias ON acceptablealias ("alias");
> -
> -CREATE TABLE address (
> - id INTEGER NOT NULL,
> - email TEXT,
> - _original TEXT,
> - display_name TEXT,
> - verified_on TIMESTAMP,
> - registered_on TIMESTAMP,
> - user_id INTEGER,
> - preferences_id INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT address_user_id_fk
> - FOREIGN KEY (user_id) REFERENCES user (id),
> - CONSTRAINT address_preferences_id_fk
> - FOREIGN KEY (preferences_id) REFERENCES preferences (id)
> - );
> -
> -CREATE TABLE autoresponserecord (
> - id INTEGER NOT NULL,
> - address_id INTEGER,
> - mailing_list_id INTEGER,
> - response_type INTEGER,
> - date_sent TIMESTAMP,
> - PRIMARY KEY (id),
> - CONSTRAINT autoresponserecord_address_id_fk
> - FOREIGN KEY (address_id) REFERENCES address (id),
> - CONSTRAINT autoresponserecord_mailing_list_id
> - FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -CREATE INDEX ix_autoresponserecord_address_id
> - ON autoresponserecord (address_id);
> -CREATE INDEX ix_autoresponserecord_mailing_list_id
> - ON autoresponserecord (mailing_list_id);
> -
> -CREATE TABLE bounceevent (
> - id INTEGER NOT NULL,
> - list_name TEXT,
> - email TEXT,
> - 'timestamp' TIMESTAMP,
> - message_id TEXT,
> - context INTEGER,
> - processed BOOLEAN,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE contentfilter (
> - id INTEGER NOT NULL,
> - mailing_list_id INTEGER,
> - filter_pattern TEXT,
> - filter_type INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT contentfilter_mailing_list_id
> - FOREIGN KEY (mailing_list_id) REFERENCES mailinglist (id)
> - );
> -CREATE INDEX ix_contentfilter_mailing_list_id
> - ON contentfilter (mailing_list_id);
> -
> -CREATE TABLE domain (
> - id INTEGER NOT NULL,
> - mail_host TEXT,
> - base_url TEXT,
> - description TEXT,
> - contact_address TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE language (
> - id INTEGER NOT NULL,
> - code TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE mailinglist (
> - id INTEGER NOT NULL,
> - -- List identity
> - list_name TEXT,
> - mail_host TEXT,
> - include_list_post_header BOOLEAN,
> - include_rfc2369_headers BOOLEAN,
> - -- Attributes not directly modifiable via the web u/i
> - created_at TIMESTAMP,
> - admin_member_chunksize INTEGER,
> - next_request_id INTEGER,
> - next_digest_number INTEGER,
> - digest_last_sent_at TIMESTAMP,
> - volume INTEGER,
> - last_post_at TIMESTAMP,
> - accept_these_nonmembers BLOB,
> - acceptable_aliases_id INTEGER,
> - admin_immed_notify BOOLEAN,
> - admin_notify_mchanges BOOLEAN,
> - administrivia BOOLEAN,
> - advertised BOOLEAN,
> - anonymous_list BOOLEAN,
> - archive BOOLEAN,
> - archive_private BOOLEAN,
> - archive_volume_frequency INTEGER,
> - -- Automatic responses.
> - autorespond_owner INTEGER,
> - autoresponse_owner_text TEXT,
> - autorespond_postings INTEGER,
> - autoresponse_postings_text TEXT,
> - autorespond_requests INTEGER,
> - autoresponse_request_text TEXT,
> - autoresponse_grace_period TEXT,
> - -- Bounces.
> - forward_unrecognized_bounces_to INTEGER,
> - process_bounces BOOLEAN,
> - bounce_info_stale_after TEXT,
> - bounce_matching_headers TEXT,
> - bounce_notify_owner_on_disable BOOLEAN,
> - bounce_notify_owner_on_removal BOOLEAN,
> - bounce_score_threshold INTEGER,
> - bounce_you_are_disabled_warnings INTEGER,
> - bounce_you_are_disabled_warnings_interval TEXT,
> - -- Content filtering.
> - filter_action INTEGER,
> - filter_content BOOLEAN,
> - collapse_alternatives BOOLEAN,
> - convert_html_to_plaintext BOOLEAN,
> - default_member_action INTEGER,
> - default_nonmember_action INTEGER,
> - description TEXT,
> - digest_footer_uri TEXT,
> - digest_header_uri TEXT,
> - digest_is_default BOOLEAN,
> - digest_send_periodic BOOLEAN,
> - digest_size_threshold FLOAT,
> - digest_volume_frequency INTEGER,
> - digestable BOOLEAN,
> - discard_these_nonmembers BLOB,
> - emergency BOOLEAN,
> - encode_ascii_prefixes BOOLEAN,
> - first_strip_reply_to BOOLEAN,
> - footer_uri TEXT,
> - forward_auto_discards BOOLEAN,
> - gateway_to_mail BOOLEAN,
> - gateway_to_news BOOLEAN,
> - generic_nonmember_action INTEGER,
> - goodbye_message_uri TEXT,
> - header_matches BLOB,
> - header_uri TEXT,
> - hold_these_nonmembers BLOB,
> - info TEXT,
> - linked_newsgroup TEXT,
> - max_days_to_hold INTEGER,
> - max_message_size INTEGER,
> - max_num_recipients INTEGER,
> - member_moderation_notice TEXT,
> - mime_is_default_digest BOOLEAN,
> - moderator_password TEXT,
> - new_member_options INTEGER,
> - news_moderation INTEGER,
> - news_prefix_subject_too BOOLEAN,
> - nntp_host TEXT,
> - nondigestable BOOLEAN,
> - nonmember_rejection_notice TEXT,
> - obscure_addresses BOOLEAN,
> - owner_chain TEXT,
> - owner_pipeline TEXT,
> - personalize INTEGER,
> - post_id INTEGER,
> - posting_chain TEXT,
> - posting_pipeline TEXT,
> - preferred_language TEXT,
> - private_roster BOOLEAN,
> - display_name TEXT,
> - reject_these_nonmembers BLOB,
> - reply_goes_to_list INTEGER,
> - reply_to_address TEXT,
> - require_explicit_destination BOOLEAN,
> - respond_to_post_requests BOOLEAN,
> - scrub_nondigest BOOLEAN,
> - send_goodbye_message BOOLEAN,
> - send_reminders BOOLEAN,
> - send_welcome_message BOOLEAN,
> - subject_prefix TEXT,
> - subscribe_auto_approval BLOB,
> - subscribe_policy INTEGER,
> - topics BLOB,
> - topics_bodylines_limit INTEGER,
> - topics_enabled BOOLEAN,
> - unsubscribe_policy INTEGER,
> - welcome_message_uri TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE member (
> - id INTEGER NOT NULL,
> - _member_id TEXT,
> - role INTEGER,
> - mailing_list TEXT,
> - moderation_action INTEGER,
> - address_id INTEGER,
> - preferences_id INTEGER,
> - user_id INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT member_address_id_fk
> - FOREIGN KEY (address_id) REFERENCES address (id),
> - CONSTRAINT member_preferences_id_fk
> - FOREIGN KEY (preferences_id) REFERENCES preferences (id)
> - CONSTRAINT member_user_id_fk
> - FOREIGN KEY (user_id) REFERENCES user (id)
> - );
> -CREATE INDEX ix_member__member_id ON member (_member_id);
> -CREATE INDEX ix_member_address_id ON member (address_id);
> -CREATE INDEX ix_member_preferences_id ON member (preferences_id);
> -
> -CREATE TABLE message (
> - id INTEGER NOT NULL,
> - message_id_hash TEXT,
> - path TEXT,
> - message_id TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE onelastdigest (
> - id INTEGER NOT NULL,
> - mailing_list_id INTEGER,
> - address_id INTEGER,
> - delivery_mode INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT onelastdigest_mailing_list_id_fk
> - FOREIGN KEY (mailing_list_id) REFERENCES mailinglist(id),
> - CONSTRAINT onelastdigest_address_id_fk
> - FOREIGN KEY (address_id) REFERENCES address(id)
> - );
> -
> -CREATE TABLE pended (
> - id INTEGER NOT NULL,
> - token TEXT,
> - expiration_date TIMESTAMP,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE pendedkeyvalue (
> - id INTEGER NOT NULL,
> - "key" TEXT,
> - value TEXT,
> - pended_id INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT pendedkeyvalue_pended_id_fk
> - FOREIGN KEY (pended_id) REFERENCES pended (id)
> - );
> -
> -CREATE TABLE preferences (
> - id INTEGER NOT NULL,
> - acknowledge_posts BOOLEAN,
> - hide_address BOOLEAN,
> - preferred_language TEXT,
> - receive_list_copy BOOLEAN,
> - receive_own_postings BOOLEAN,
> - delivery_mode INTEGER,
> - delivery_status INTEGER,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE user (
> - id INTEGER NOT NULL,
> - display_name TEXT,
> - password BINARY,
> - _user_id TEXT,
> - _created_on TIMESTAMP,
> - _preferred_address_id INTEGER,
> - preferences_id INTEGER,
> - PRIMARY KEY (id),
> - CONSTRAINT user_preferences_id_fk
> - FOREIGN KEY (preferences_id) REFERENCES preferences (id),
> - CONSTRAINT _preferred_address_id_fk
> - FOREIGN KEY (_preferred_address_id) REFERENCES address (id)
> - );
> -CREATE INDEX ix_user_user_id ON user (_user_id);
> -
> -CREATE TABLE version (
> - id INTEGER NOT NULL,
> - component TEXT,
> - version TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE INDEX ix__request_mailing_list_id ON _request (mailing_list_id);
> -CREATE INDEX ix_address_preferences_id ON address (preferences_id);
> -CREATE INDEX ix_address_user_id ON address (user_id);
> -CREATE INDEX ix_pendedkeyvalue_pended_id ON pendedkeyvalue (pended_id);
> -CREATE INDEX ix_user_preferences_id ON user (preferences_id);
> -
> -CREATE TABLE ban (
> - id INTEGER NOT NULL,
> - email TEXT,
> - mailing_list TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE TABLE uid (
> - -- Keep track of all assigned unique ids to prevent re-use.
> - id INTEGER NOT NULL,
> - uid TEXT,
> - PRIMARY KEY (id)
> - );
> -CREATE INDEX ix_uid_uid ON uid (uid);
>
> === removed file 'src/mailman/database/schema/sqlite_20120407000000_01.sql'
> --- src/mailman/database/schema/sqlite_20120407000000_01.sql 2013-09-01 15:15:08 +0000
> +++ src/mailman/database/schema/sqlite_20120407000000_01.sql 1970-01-01 00:00:00 +0000
> @@ -1,280 +0,0 @@
> --- This file contains the sqlite3 schema migration from
> --- 3.0b1 TO 3.0b2
> ---
> --- 3.0b2 has been released thus you MAY NOT edit this file.
> -
> --- For SQLite3 migration strategy, see
> --- http://sqlite.org/faq.html#q11
> -
> --- REMOVALS from the mailinglist table:
> --- REM archive
> --- REM archive_private
> --- REM archive_volume_frequency
> --- REM include_list_post_header
> --- REM news_moderation
> --- REM news_prefix_subject_too
> --- REM nntp_host
> ---
> --- ADDS to the mailing list table:
> --- ADD allow_list_posts
> --- ADD archive_policy
> --- ADD list_id
> --- ADD newsgroup_moderation
> --- ADD nntp_prefix_subject_too
> -
> --- LP: #971013
> --- LP: #967238
> -
> --- REMOVALS from the member table:
> --- REM mailing_list
> -
> --- ADDS to the member table:
> --- ADD list_id
> -
> --- LP: #1024509
> -
> -
> -CREATE TABLE mailinglist_backup (
> - id INTEGER NOT NULL,
> - -- List identity
> - list_name TEXT,
> - mail_host TEXT,
> - allow_list_posts BOOLEAN,
> - include_rfc2369_headers BOOLEAN,
> - -- Attributes not directly modifiable via the web u/i
> - created_at TIMESTAMP,
> - admin_member_chunksize INTEGER,
> - next_request_id INTEGER,
> - next_digest_number INTEGER,
> - digest_last_sent_at TIMESTAMP,
> - volume INTEGER,
> - last_post_at TIMESTAMP,
> - accept_these_nonmembers BLOB,
> - acceptable_aliases_id INTEGER,
> - admin_immed_notify BOOLEAN,
> - admin_notify_mchanges BOOLEAN,
> - administrivia BOOLEAN,
> - advertised BOOLEAN,
> - anonymous_list BOOLEAN,
> - -- Automatic responses.
> - autorespond_owner INTEGER,
> - autoresponse_owner_text TEXT,
> - autorespond_postings INTEGER,
> - autoresponse_postings_text TEXT,
> - autorespond_requests INTEGER,
> - autoresponse_request_text TEXT,
> - autoresponse_grace_period TEXT,
> - -- Bounces.
> - forward_unrecognized_bounces_to INTEGER,
> - process_bounces BOOLEAN,
> - bounce_info_stale_after TEXT,
> - bounce_matching_headers TEXT,
> - bounce_notify_owner_on_disable BOOLEAN,
> - bounce_notify_owner_on_removal BOOLEAN,
> - bounce_score_threshold INTEGER,
> - bounce_you_are_disabled_warnings INTEGER,
> - bounce_you_are_disabled_warnings_interval TEXT,
> - -- Content filtering.
> - filter_action INTEGER,
> - filter_content BOOLEAN,
> - collapse_alternatives BOOLEAN,
> - convert_html_to_plaintext BOOLEAN,
> - default_member_action INTEGER,
> - default_nonmember_action INTEGER,
> - description TEXT,
> - digest_footer_uri TEXT,
> - digest_header_uri TEXT,
> - digest_is_default BOOLEAN,
> - digest_send_periodic BOOLEAN,
> - digest_size_threshold FLOAT,
> - digest_volume_frequency INTEGER,
> - digestable BOOLEAN,
> - discard_these_nonmembers BLOB,
> - emergency BOOLEAN,
> - encode_ascii_prefixes BOOLEAN,
> - first_strip_reply_to BOOLEAN,
> - footer_uri TEXT,
> - forward_auto_discards BOOLEAN,
> - gateway_to_mail BOOLEAN,
> - gateway_to_news BOOLEAN,
> - goodbye_message_uri TEXT,
> - header_matches BLOB,
> - header_uri TEXT,
> - hold_these_nonmembers BLOB,
> - info TEXT,
> - linked_newsgroup TEXT,
> - max_days_to_hold INTEGER,
> - max_message_size INTEGER,
> - max_num_recipients INTEGER,
> - member_moderation_notice TEXT,
> - mime_is_default_digest BOOLEAN,
> - moderator_password TEXT,
> - new_member_options INTEGER,
> - nondigestable BOOLEAN,
> - nonmember_rejection_notice TEXT,
> - obscure_addresses BOOLEAN,
> - owner_chain TEXT,
> - owner_pipeline TEXT,
> - personalize INTEGER,
> - post_id INTEGER,
> - posting_chain TEXT,
> - posting_pipeline TEXT,
> - preferred_language TEXT,
> - private_roster BOOLEAN,
> - display_name TEXT,
> - reject_these_nonmembers BLOB,
> - reply_goes_to_list INTEGER,
> - reply_to_address TEXT,
> - require_explicit_destination BOOLEAN,
> - respond_to_post_requests BOOLEAN,
> - scrub_nondigest BOOLEAN,
> - send_goodbye_message BOOLEAN,
> - send_reminders BOOLEAN,
> - send_welcome_message BOOLEAN,
> - subject_prefix TEXT,
> - subscribe_auto_approval BLOB,
> - subscribe_policy INTEGER,
> - topics BLOB,
> - topics_bodylines_limit INTEGER,
> - topics_enabled BOOLEAN,
> - unsubscribe_policy INTEGER,
> - welcome_message_uri TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -INSERT INTO mailinglist_backup SELECT
> - id,
> - -- List identity
> - list_name,
> - mail_host,
> - include_list_post_header,
> - include_rfc2369_headers,
> - -- Attributes not directly modifiable via the web u/i
> - created_at,
> - admin_member_chunksize,
> - next_request_id,
> - next_digest_number,
> - digest_last_sent_at,
> - volume,
> - last_post_at,
> - accept_these_nonmembers,
> - acceptable_aliases_id,
> - admin_immed_notify,
> - admin_notify_mchanges,
> - administrivia,
> - advertised,
> - anonymous_list,
> - -- Automatic responses.
> - autorespond_owner,
> - autoresponse_owner_text,
> - autorespond_postings,
> - autoresponse_postings_text,
> - autorespond_requests,
> - autoresponse_request_text,
> - autoresponse_grace_period,
> - -- Bounces.
> - forward_unrecognized_bounces_to,
> - process_bounces,
> - bounce_info_stale_after,
> - bounce_matching_headers,
> - bounce_notify_owner_on_disable,
> - bounce_notify_owner_on_removal,
> - bounce_score_threshold,
> - bounce_you_are_disabled_warnings,
> - bounce_you_are_disabled_warnings_interval,
> - -- Content filtering.
> - filter_action,
> - filter_content,
> - collapse_alternatives,
> - convert_html_to_plaintext,
> - default_member_action,
> - default_nonmember_action,
> - description,
> - digest_footer_uri,
> - digest_header_uri,
> - digest_is_default,
> - digest_send_periodic,
> - digest_size_threshold,
> - digest_volume_frequency,
> - digestable,
> - discard_these_nonmembers,
> - emergency,
> - encode_ascii_prefixes,
> - first_strip_reply_to,
> - footer_uri,
> - forward_auto_discards,
> - gateway_to_mail,
> - gateway_to_news,
> - goodbye_message_uri,
> - header_matches,
> - header_uri,
> - hold_these_nonmembers,
> - info,
> - linked_newsgroup,
> - max_days_to_hold,
> - max_message_size,
> - max_num_recipients,
> - member_moderation_notice,
> - mime_is_default_digest,
> - moderator_password,
> - new_member_options,
> - nondigestable,
> - nonmember_rejection_notice,
> - obscure_addresses,
> - owner_chain,
> - owner_pipeline,
> - personalize,
> - post_id,
> - posting_chain,
> - posting_pipeline,
> - preferred_language,
> - private_roster,
> - display_name,
> - reject_these_nonmembers,
> - reply_goes_to_list,
> - reply_to_address,
> - require_explicit_destination,
> - respond_to_post_requests,
> - scrub_nondigest,
> - send_goodbye_message,
> - send_reminders,
> - send_welcome_message,
> - subject_prefix,
> - subscribe_auto_approval,
> - subscribe_policy,
> - topics,
> - topics_bodylines_limit,
> - topics_enabled,
> - unsubscribe_policy,
> - welcome_message_uri
> - FROM mailinglist;
> -
> -CREATE TABLE member_backup(
> - id INTEGER NOT NULL,
> - _member_id TEXT,
> - role INTEGER,
> - moderation_action INTEGER,
> - address_id INTEGER,
> - preferences_id INTEGER,
> - user_id INTEGER,
> - PRIMARY KEY (id)
> - );
> -
> -INSERT INTO member_backup SELECT
> - id,
> - _member_id,
> - role,
> - moderation_action,
> - address_id,
> - preferences_id,
> - user_id
> - FROM member;
> -
> -
> --- Add the new columns. They'll get inserted at the Python layer.
> -ALTER TABLE mailinglist_backup ADD COLUMN archive_policy INTEGER;
> -ALTER TABLE mailinglist_backup ADD COLUMN list_id TEXT;
> -ALTER TABLE mailinglist_backup ADD COLUMN nntp_prefix_subject_too INTEGER;
> -ALTER TABLE mailinglist_backup ADD COLUMN newsgroup_moderation INTEGER;
> -
> -ALTER TABLE member_backup ADD COLUMN list_id TEXT;
>
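
The *_backup / INSERT ... SELECT dance in the SQL file above is forced by the ALTER TABLE restrictions described in the SQLite FAQ entry it cites: at the time these migrations were written, SQLite's ALTER TABLE supported only RENAME TO and ADD COLUMN, so obsolete columns could not simply be dropped in place. A quick illustration (note that SQLite 3.35 and later do accept DROP COLUMN, so the error below only appears on the older versions these migrations targeted):

    import sqlite3

    conn = sqlite3.connect(':memory:')
    cur = conn.cursor()
    cur.execute('CREATE TABLE t (id INTEGER PRIMARY KEY, obsolete TEXT);')
    # ADD COLUMN has long been supported...
    cur.execute('ALTER TABLE t ADD COLUMN added TEXT;')
    try:
        # ...but DROP COLUMN raises OperationalError on pre-3.35 SQLite,
        # which is why these migrations copy into a *_backup table instead.
        cur.execute('ALTER TABLE t DROP COLUMN obsolete;')
    except sqlite3.OperationalError as error:
        print('DROP COLUMN not supported here: {0}'.format(error))
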
> === removed file 'src/mailman/database/schema/sqlite_20121015000000_01.sql'
> --- src/mailman/database/schema/sqlite_20121015000000_01.sql 2013-09-01 15:15:08 +0000
> +++ src/mailman/database/schema/sqlite_20121015000000_01.sql 1970-01-01 00:00:00 +0000
> @@ -1,230 +0,0 @@
> --- This file contains the sqlite3 schema migration from
> --- 3.0b2 TO 3.0b3
> ---
> --- 3.0b3 has been released thus you MAY NOT edit this file.
> -
> --- REMOVALS from the ban table:
> --- REM mailing_list
> -
> --- ADDS to the ban table:
> --- ADD list_id
> -
> -CREATE TABLE ban_backup (
> - id INTEGER NOT NULL,
> - email TEXT,
> - PRIMARY KEY (id)
> - );
> -
> -INSERT INTO ban_backup SELECT
> - id, email
> - FROM ban;
> -
> -ALTER TABLE ban_backup ADD COLUMN list_id TEXT;
> -
> --- REMOVALS from the mailinglist table.
> --- REM new_member_options
> --- REM send_reminders
> --- REM subscribe_policy
> --- REM unsubscribe_policy
> --- REM subscribe_auto_approval
> --- REM private_roster
> --- REM admin_member_chunksize
> -
> -CREATE TABLE mailinglist_backup (
> - id INTEGER NOT NULL,
> - list_name TEXT,
> - mail_host TEXT,
> - allow_list_posts BOOLEAN,
> - include_rfc2369_headers BOOLEAN,
> - created_at TIMESTAMP,
> - next_request_id INTEGER,
> - next_digest_number INTEGER,
> - digest_last_sent_at TIMESTAMP,
> - volume INTEGER,
> - last_post_at TIMESTAMP,
> - accept_these_nonmembers BLOB,
> - acceptable_aliases_id INTEGER,
> - admin_immed_notify BOOLEAN,
> - admin_notify_mchanges BOOLEAN,
> - administrivia BOOLEAN,
> - advertised BOOLEAN,
> - anonymous_list BOOLEAN,
> - autorespond_owner INTEGER,
> - autoresponse_owner_text TEXT,
> - autorespond_postings INTEGER,
> - autoresponse_postings_text TEXT,
> - autorespond_requests INTEGER,
> - autoresponse_request_text TEXT,
> - autoresponse_grace_period TEXT,
> - forward_unrecognized_bounces_to INTEGER,
> - process_bounces BOOLEAN,
> - bounce_info_stale_after TEXT,
> - bounce_matching_headers TEXT,
> - bounce_notify_owner_on_disable BOOLEAN,
> - bounce_notify_owner_on_removal BOOLEAN,
> - bounce_score_threshold INTEGER,
> - bounce_you_are_disabled_warnings INTEGER,
> - bounce_you_are_disabled_warnings_interval TEXT,
> - filter_action INTEGER,
> - filter_content BOOLEAN,
> - collapse_alternatives BOOLEAN,
> - convert_html_to_plaintext BOOLEAN,
> - default_member_action INTEGER,
> - default_nonmember_action INTEGER,
> - description TEXT,
> - digest_footer_uri TEXT,
> - digest_header_uri TEXT,
> - digest_is_default BOOLEAN,
> - digest_send_periodic BOOLEAN,
> - digest_size_threshold FLOAT,
> - digest_volume_frequency INTEGER,
> - digestable BOOLEAN,
> - discard_these_nonmembers BLOB,
> - emergency BOOLEAN,
> - encode_ascii_prefixes BOOLEAN,
> - first_strip_reply_to BOOLEAN,
> - footer_uri TEXT,
> - forward_auto_discards BOOLEAN,
> - gateway_to_mail BOOLEAN,
> - gateway_to_news BOOLEAN,
> - goodbye_message_uri TEXT,
> - header_matches BLOB,
> - header_uri TEXT,
> - hold_these_nonmembers BLOB,
> - info TEXT,
> - linked_newsgroup TEXT,
> - max_days_to_hold INTEGER,
> - max_message_size INTEGER,
> - max_num_recipients INTEGER,
> - member_moderation_notice TEXT,
> - mime_is_default_digest BOOLEAN,
> - moderator_password TEXT,
> - nondigestable BOOLEAN,
> - nonmember_rejection_notice TEXT,
> - obscure_addresses BOOLEAN,
> - owner_chain TEXT,
> - owner_pipeline TEXT,
> - personalize INTEGER,
> - post_id INTEGER,
> - posting_chain TEXT,
> - posting_pipeline TEXT,
> - preferred_language TEXT,
> - display_name TEXT,
> - reject_these_nonmembers BLOB,
> - reply_goes_to_list INTEGER,
> - reply_to_address TEXT,
> - require_explicit_destination BOOLEAN,
> - respond_to_post_requests BOOLEAN,
> - scrub_nondigest BOOLEAN,
> - send_goodbye_message BOOLEAN,
> - send_welcome_message BOOLEAN,
> - subject_prefix TEXT,
> - topics BLOB,
> - topics_bodylines_limit INTEGER,
> - topics_enabled BOOLEAN,
> - welcome_message_uri TEXT,
> - archive_policy INTEGER,
> - list_id TEXT,
> - nntp_prefix_subject_too INTEGER,
> - newsgroup_moderation INTEGER,
> - PRIMARY KEY (id)
> - );
> -
> -INSERT INTO mailinglist_backup SELECT
> - id,
> - list_name,
> - mail_host,
> - allow_list_posts,
> - include_rfc2369_headers,
> - created_at,
> - next_request_id,
> - next_digest_number,
> - digest_last_sent_at,
> - volume,
> - last_post_at,
> - accept_these_nonmembers,
> - acceptable_aliases_id,
> - admin_immed_notify,
> - admin_notify_mchanges,
> - administrivia,
> - advertised,
> - anonymous_list,
> - autorespond_owner,
> - autoresponse_owner_text,
> - autorespond_postings,
> - autoresponse_postings_text,
> - autorespond_requests,
> - autoresponse_request_text,
> - autoresponse_grace_period,
> - forward_unrecognized_bounces_to,
> - process_bounces,
> - bounce_info_stale_after,
> - bounce_matching_headers,
> - bounce_notify_owner_on_disable,
> - bounce_notify_owner_on_removal,
> - bounce_score_threshold,
> - bounce_you_are_disabled_warnings,
> - bounce_you_are_disabled_warnings_interval,
> - filter_action,
> - filter_content,
> - collapse_alternatives,
> - convert_html_to_plaintext,
> - default_member_action,
> - default_nonmember_action,
> - description,
> - digest_footer_uri,
> - digest_header_uri,
> - digest_is_default,
> - digest_send_periodic,
> - digest_size_threshold,
> - digest_volume_frequency,
> - digestable,
> - discard_these_nonmembers,
> - emergency,
> - encode_ascii_prefixes,
> - first_strip_reply_to,
> - footer_uri,
> - forward_auto_discards,
> - gateway_to_mail,
> - gateway_to_news,
> - goodbye_message_uri,
> - header_matches,
> - header_uri,
> - hold_these_nonmembers,
> - info,
> - linked_newsgroup,
> - max_days_to_hold,
> - max_message_size,
> - max_num_recipients,
> - member_moderation_notice,
> - mime_is_default_digest,
> - moderator_password,
> - nondigestable,
> - nonmember_rejection_notice,
> - obscure_addresses,
> - owner_chain,
> - owner_pipeline,
> - personalize,
> - post_id,
> - posting_chain,
> - posting_pipeline,
> - preferred_language,
> - display_name,
> - reject_these_nonmembers,
> - reply_goes_to_list,
> - reply_to_address,
> - require_explicit_destination,
> - respond_to_post_requests,
> - scrub_nondigest,
> - send_goodbye_message,
> - send_welcome_message,
> - subject_prefix,
> - topics,
> - topics_bodylines_limit,
> - topics_enabled,
> - welcome_message_uri,
> - archive_policy,
> - list_id,
> - nntp_prefix_subject_too,
> - newsgroup_moderation
> - FROM mailinglist;
>
> === removed file 'src/mailman/database/schema/sqlite_20130406000000_01.sql'
> --- src/mailman/database/schema/sqlite_20130406000000_01.sql 2013-11-26 02:26:15 +0000
> +++ src/mailman/database/schema/sqlite_20130406000000_01.sql 1970-01-01 00:00:00 +0000
> @@ -1,46 +0,0 @@
> --- This file contains the SQLite schema migration from
> --- 3.0b3 to 3.0b4
> ---
> --- After 3.0b4 is released you may not edit this file.
> -
> --- For SQLite3 migration strategy, see
> --- http://sqlite.org/faq.html#q11
> -
> --- ADD listarchiver table.
> -
> --- REMOVALs from the bounceevent table:
> --- REM list_name
> -
> --- ADDs to the bounceevent table:
> --- ADD list_id
> -
> --- ADDs to the mailinglist table:
> --- ADD archiver_id
> -
> -CREATE TABLE bounceevent_backup (
> - id INTEGER NOT NULL,
> - email TEXT,
> - 'timestamp' TIMESTAMP,
> - message_id TEXT,
> - context INTEGER,
> - processed BOOLEAN,
> - PRIMARY KEY (id)
> - );
> -
> -INSERT INTO bounceevent_backup SELECT
> - id, email, "timestamp", message_id,
> - context, processed
> - FROM bounceevent;
> -
> -ALTER TABLE bounceevent_backup ADD COLUMN list_id TEXT;
> -
> -CREATE TABLE listarchiver (
> - id INTEGER NOT NULL,
> - mailing_list_id INTEGER NOT NULL,
> - name TEXT NOT NULL,
> - _is_enabled BOOLEAN,
> - PRIMARY KEY (id)
> - );
> -
> -CREATE INDEX ix_listarchiver_mailing_list_id
> - ON listarchiver(mailing_list_id);
>
> === modified file 'src/mailman/database/sqlite.py'
> --- src/mailman/database/sqlite.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/sqlite.py 2014-10-14 01:13:52 +0000
> @@ -22,63 +22,27 @@
> __metaclass__ = type
> __all__ = [
> 'SQLiteDatabase',
> - 'make_temporary',
> ]
>
>
> import os
> -import types
> -import shutil
> -import tempfile
>
> -from functools import partial
> +from mailman.database.base import SABaseDatabase
> from urlparse import urlparse
>
> -from mailman.database.base import StormBaseDatabase
> -from mailman.testing.helpers import configuration
> -
>
>
> -class SQLiteDatabase(StormBaseDatabase):
> +class SQLiteDatabase(SABaseDatabase):
> """Database class for SQLite."""
>
> - TAG = 'sqlite'
> -
> - def _database_exists(self, store):
> - """See `BaseDatabase`."""
> - table_query = 'select tbl_name from sqlite_master;'
> - table_names = set(item[0] for item in
> - store.execute(table_query))
> - return 'version' in table_names
> -
> def _prepare(self, url):
> parts = urlparse(url)
> assert parts.scheme == 'sqlite', (
> 'Database url mismatch (expected sqlite prefix): {0}'.format(url))
> + # Ensure that the SQLite database file has the proper permissions,
> + # since SQLite doesn't play nice with umask.
> path = os.path.normpath(parts.path)
> - fd = os.open(path, os.O_WRONLY | os.O_NONBLOCK | os.O_CREAT, 0666)
> + fd = os.open(path, os.O_WRONLY | os.O_NONBLOCK | os.O_CREAT, 0o666)
> # Ignore errors
> if fd > 0:
> os.close(fd)
> -
> -
> -
> -# Test suite adapter for ITemporaryDatabase.
> -
> -def _cleanup(self, tempdir):
> - shutil.rmtree(tempdir)
> -
> -
> -def make_temporary(database):
> - """Adapts by monkey patching an existing SQLite IDatabase."""
> - tempdir = tempfile.mkdtemp()
> - url = 'sqlite:///' + os.path.join(tempdir, 'mailman.db')
> - with configuration('database', url=url):
> - database.initialize()
> - database._cleanup = types.MethodType(
> - partial(_cleanup, tempdir=tempdir),
> - database)
> - # bool column values in SQLite must be integers.
> - database.FALSE = 0
> - database.TRUE = 1
> - return database
>
Is `0o666` the correct permission value here? I think it should be 0666, like before.
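
For reference, the two spellings denote the same value (438 decimal): `0o666` is the octal-literal form accepted by Python 2.6+ and required by Python 3, whereas a bare `0666` only parses on Python 2. A quick sanity check on a Python 2.7 interpreter, where both forms are legal:

    import os
    import stat
    import tempfile

    assert 0o666 == 438    # same value either way; 0666 == 0o666 on Python 2
    path = os.path.join(tempfile.mkdtemp(), 'perm-check.db')
    fd = os.open(path, os.O_WRONLY | os.O_NONBLOCK | os.O_CREAT, 0o666)
    os.close(fd)
    # The resulting mode is still subject to the process umask.
    print(oct(stat.S_IMODE(os.stat(path).st_mode)))
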
> === added directory 'src/mailman/database/tests'
> === removed directory 'src/mailman/database/tests'
> === added file 'src/mailman/database/tests/__init__.py'
> === removed file 'src/mailman/database/tests/__init__.py'
> === removed directory 'src/mailman/database/tests/data'
> === removed file 'src/mailman/database/tests/data/__init__.py'
> === removed file 'src/mailman/database/tests/data/mailman_01.db'
> Binary files src/mailman/database/tests/data/mailman_01.db 2012-04-20 21:32:27 +0000 and src/mailman/database/tests/data/mailman_01.db 1970-01-01 00:00:00 +0000 differ
> === removed file 'src/mailman/database/tests/data/migration_postgres_1.sql'
> --- src/mailman/database/tests/data/migration_postgres_1.sql 2012-07-26 04:22:19 +0000
> +++ src/mailman/database/tests/data/migration_postgres_1.sql 1970-01-01 00:00:00 +0000
> @@ -1,133 +0,0 @@
> -INSERT INTO "acceptablealias" VALUES(1,'foo(a)example.com',1);
> -INSERT INTO "acceptablealias" VALUES(2,'bar(a)example.com',1);
> -
> -INSERT INTO "address" VALUES(
> - 1,'anne(a)example.com',NULL,'Anne Person',
> - '2012-04-19 00:52:24.826432','2012-04-19 00:49:42.373769',1,2);
> -INSERT INTO "address" VALUES(
> - 2,'bart(a)example.com',NULL,'Bart Person',
> - '2012-04-19 00:53:25.878800','2012-04-19 00:49:52.882050',2,4);
> -
> -INSERT INTO "domain" VALUES(
> - 1,'example.com','http://example.com',NULL,'postmaster(a)example.com');
> -
> -INSERT INTO "mailinglist" VALUES(
> - -- id,list_name,mail_host,include_list_post_header,include_rfc2369_headers
> - 1,'test','example.com',True,True,
> - -- created_at,admin_member_chunksize,next_request_id,next_digest_number
> - '2012-04-19 00:46:13.173844',30,1,1,
> - -- digest_last_sent_at,volume,last_post_at,accept_these_nonmembers
> - NULL,1,NULL,E'\\x80025D71012E',
> - -- acceptable_aliases_id,admin_immed_notify,admin_notify_mchanges
> - NULL,True,False,
> - -- administrivia,advertised,anonymous_list,archive,archive_private
> - True,True,False,True,False,
> - -- archive_volume_frequency
> - 1,
> - --autorespond_owner,autoresponse_owner_text
> - 0,'',
> - -- autorespond_postings,autoresponse_postings_text
> - 0,'',
> - -- autorespond_requests,autoresponse_request_text
> - 0,'',
> - -- autoresponse_grace_period
> - '90 days, 0:00:00',
> - -- forward_unrecognized_bounces_to,process_bounces
> - 1,True,
> - -- bounce_info_stale_after,bounce_matching_headers
> - '7 days, 0:00:00','
> -# Lines that *start* with a ''#'' are comments.
> -to: friend(a)public.com
> -message-id: relay.comanche.denmark.eu
> -from: list(a)listme.com
> -from: .*(a)uplinkpro.com
> -',
> - -- bounce_notify_owner_on_disable,bounce_notify_owner_on_removal
> - True,True,
> - -- bounce_score_threshold,bounce_you_are_disabled_warnings
> - 5,3,
> - -- bounce_you_are_disabled_warnings_interval
> - '7 days, 0:00:00',
> - -- filter_action,filter_content,collapse_alternatives
> - 2,False,True,
> - -- convert_html_to_plaintext,default_member_action,default_nonmember_action
> - False,4,0,
> - -- description
> - '',
> - -- digest_footer_uri
> - 'mailman:///$listname/$language/footer-generic.txt',
> - -- digest_header_uri
> - NULL,
> - -- digest_is_default,digest_send_periodic,digest_size_threshold
> - False,True,30.0,
> - -- digest_volume_frequency,digestable,discard_these_nonmembers
> - 1,True,E'\\x80025D71012E',
> - -- emergency,encode_ascii_prefixes,first_strip_reply_to
> - False,False,False,
> - -- footer_uri
> - 'mailman:///$listname/$language/footer-generic.txt',
> - -- forward_auto_discards,gateway_to_mail,gateway_to_news
> - True,False,False,
> - -- generic_nonmember_action,goodbye_message_uri
> - 1,'',
> - -- header_matches,header_uri,hold_these_nonmembers,info,linked_newsgroup
> - E'\\x80025D71012E',NULL,E'\\x80025D71012E','','',
> - -- max_days_to_hold,max_message_size,max_num_recipients
> - 0,40,10,
> - -- member_moderation_notice,mime_is_default_digest,moderator_password
> - '',False,NULL,
> - -- new_member_options,news_moderation,news_prefix_subject_too
> - 256,0,True,
> - -- nntp_host,nondigestable,nonmember_rejection_notice,obscure_addresses
> - '',True,'',True,
> - -- owner_chain,owner_pipeline,personalize,post_id
> - 'default-owner-chain','default-owner-pipeline',0,1,
> - -- posting_chain,posting_pipeline,preferred_language,private_roster
> - 'default-posting-chain','default-posting-pipeline','en',True,
> - -- display_name,reject_these_nonmembers
> - 'Test',E'\\x80025D71012E',
> - -- reply_goes_to_list,reply_to_address
> - 0,'',
> - -- require_explicit_destination,respond_to_post_requests
> - True,True,
> - -- scrub_nondigest,send_goodbye_message,send_reminders,send_welcome_message
> - False,True,True,True,
> - -- subject_prefix,subscribe_auto_approval
> - '[Test] ',E'\\x80025D71012E',
> - -- subscribe_policy,topics,topics_bodylines_limit,topics_enabled
> - 1,E'\\x80025D71012E',5,False,
> - -- unsubscribe_policy,welcome_message_uri
> - 0,'mailman:///welcome.txt');
> -
> -INSERT INTO "member" VALUES(
> - 1,'d1243f4d-e604-4f6b-af52-98d0a7bce0f1',1,'test(a)example.com',4,NULL,5,1);
> -INSERT INTO "member" VALUES(
> - 2,'dccc3851-fdfb-4afa-90cf-bdcbf80ad0fd',2,'test(a)example.com',3,NULL,6,1);
> -INSERT INTO "member" VALUES(
> - 3,'479be431-45f2-473d-bc3c-7eac614030ac',3,'test(a)example.com',3,NULL,7,2);
> -INSERT INTO "member" VALUES(
> - 4,'e2dc604c-d93a-4b91-b5a8-749e3caade36',1,'test(a)example.com',4,NULL,8,2);
> -
> -INSERT INTO "preferences" VALUES(1,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(2,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(3,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(4,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(5,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(6,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(7,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(8,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -
> -INSERT INTO "user" VALUES(
> - 1,'Anne Person',NULL,'0adf3caa-6f26-46f8-a11d-5256c8148592',
> - '2012-04-19 00:49:42.370493',1,1);
> -INSERT INTO "user" VALUES(
> - 2,'Bart Person',NULL,'63f5d1a2-e533-4055-afe4-475dec3b1163',
> - '2012-04-19 00:49:52.868746',2,3);
> -
> -INSERT INTO "uid" VALUES(1,'8bf9a615-f23e-4980-b7d1-90ac0203c66f');
> -INSERT INTO "uid" VALUES(2,'0adf3caa-6f26-46f8-a11d-5256c8148592');
> -INSERT INTO "uid" VALUES(3,'63f5d1a2-e533-4055-afe4-475dec3b1163');
> -INSERT INTO "uid" VALUES(4,'d1243f4d-e604-4f6b-af52-98d0a7bce0f1');
> -INSERT INTO "uid" VALUES(5,'dccc3851-fdfb-4afa-90cf-bdcbf80ad0fd');
> -INSERT INTO "uid" VALUES(6,'479be431-45f2-473d-bc3c-7eac614030ac');
> -INSERT INTO "uid" VALUES(7,'e2dc604c-d93a-4b91-b5a8-749e3caade36');
>
> === removed file 'src/mailman/database/tests/data/migration_sqlite_1.sql'
> --- src/mailman/database/tests/data/migration_sqlite_1.sql 2012-07-26 04:22:19 +0000
> +++ src/mailman/database/tests/data/migration_sqlite_1.sql 1970-01-01 00:00:00 +0000
> @@ -1,133 +0,0 @@
> -INSERT INTO "acceptablealias" VALUES(1,'foo(a)example.com',1);
> -INSERT INTO "acceptablealias" VALUES(2,'bar(a)example.com',1);
> -
> -INSERT INTO "address" VALUES(
> - 1,'anne(a)example.com',NULL,'Anne Person',
> - '2012-04-19 00:52:24.826432','2012-04-19 00:49:42.373769',1,2);
> -INSERT INTO "address" VALUES(
> - 2,'bart(a)example.com',NULL,'Bart Person',
> - '2012-04-19 00:53:25.878800','2012-04-19 00:49:52.882050',2,4);
> -
> -INSERT INTO "domain" VALUES(
> - 1,'example.com','http://example.com',NULL,'postmaster(a)example.com');
> -
> -INSERT INTO "mailinglist" VALUES(
> - -- id,list_name,mail_host,include_list_post_header,include_rfc2369_headers
> - 1,'test','example.com',1,1,
> - -- created_at,admin_member_chunksize,next_request_id,next_digest_number
> - '2012-04-19 00:46:13.173844',30,1,1,
> - -- digest_last_sent_at,volume,last_post_at,accept_these_nonmembers
> - NULL,1,NULL,X'80025D71012E',
> - -- acceptable_aliases_id,admin_immed_notify,admin_notify_mchanges
> - NULL,1,0,
> - -- administrivia,advertised,anonymous_list,archive,archive_private
> - 1,1,0,1,0,
> - -- archive_volume_frequency
> - 1,
> - --autorespond_owner,autoresponse_owner_text
> - 0,'',
> - -- autorespond_postings,autoresponse_postings_text
> - 0,'',
> - -- autorespond_requests,autoresponse_request_text
> - 0,'',
> - -- autoresponse_grace_period
> - '90 days, 0:00:00',
> - -- forward_unrecognized_bounces_to,process_bounces
> - 1,1,
> - -- bounce_info_stale_after,bounce_matching_headers
> - '7 days, 0:00:00','
> -# Lines that *start* with a ''#'' are comments.
> -to: friend(a)public.com
> -message-id: relay.comanche.denmark.eu
> -from: list(a)listme.com
> -from: .*(a)uplinkpro.com
> -',
> - -- bounce_notify_owner_on_disable,bounce_notify_owner_on_removal
> - 1,1,
> - -- bounce_score_threshold,bounce_you_are_disabled_warnings
> - 5,3,
> - -- bounce_you_are_disabled_warnings_interval
> - '7 days, 0:00:00',
> - -- filter_action,filter_content,collapse_alternatives
> - 2,0,1,
> - -- convert_html_to_plaintext,default_member_action,default_nonmember_action
> - 0,4,0,
> - -- description
> - '',
> - -- digest_footer_uri
> - 'mailman:///$listname/$language/footer-generic.txt',
> - -- digest_header_uri
> - NULL,
> - -- digest_is_default,digest_send_periodic,digest_size_threshold
> - 0,1,30.0,
> - -- digest_volume_frequency,digestable,discard_these_nonmembers
> - 1,1,X'80025D71012E',
> - -- emergency,encode_ascii_prefixes,first_strip_reply_to
> - 0,0,0,
> - -- footer_uri
> - 'mailman:///$listname/$language/footer-generic.txt',
> - -- forward_auto_discards,gateway_to_mail,gateway_to_news
> - 1,0,0,
> - -- generic_nonmember_action,goodbye_message_uri
> - 1,'',
> - -- header_matches,header_uri,hold_these_nonmembers,info,linked_newsgroup
> - X'80025D71012E',NULL,X'80025D71012E','','',
> - -- max_days_to_hold,max_message_size,max_num_recipients
> - 0,40,10,
> - -- member_moderation_notice,mime_is_default_digest,moderator_password
> - '',0,NULL,
> - -- new_member_options,news_moderation,news_prefix_subject_too
> - 256,0,1,
> - -- nntp_host,nondigestable,nonmember_rejection_notice,obscure_addresses
> - '',1,'',1,
> - -- owner_chain,owner_pipeline,personalize,post_id
> - 'default-owner-chain','default-owner-pipeline',0,1,
> - -- posting_chain,posting_pipeline,preferred_language,private_roster
> - 'default-posting-chain','default-posting-pipeline','en',1,
> - -- display_name,reject_these_nonmembers
> - 'Test',X'80025D71012E',
> - -- reply_goes_to_list,reply_to_address
> - 0,'',
> - -- require_explicit_destination,respond_to_post_requests
> - 1,1,
> - -- scrub_nondigest,send_goodbye_message,send_reminders,send_welcome_message
> - 0,1,1,1,
> - -- subject_prefix,subscribe_auto_approval
> - '[Test] ',X'80025D71012E',
> - -- subscribe_policy,topics,topics_bodylines_limit,topics_enabled
> - 1,X'80025D71012E',5,0,
> - -- unsubscribe_policy,welcome_message_uri
> - 0,'mailman:///welcome.txt');
> -
> -INSERT INTO "member" VALUES(
> - 1,'d1243f4d-e604-4f6b-af52-98d0a7bce0f1',1,'test(a)example.com',4,NULL,5,1);
> -INSERT INTO "member" VALUES(
> - 2,'dccc3851-fdfb-4afa-90cf-bdcbf80ad0fd',2,'test(a)example.com',3,NULL,6,1);
> -INSERT INTO "member" VALUES(
> - 3,'479be431-45f2-473d-bc3c-7eac614030ac',3,'test(a)example.com',3,NULL,7,2);
> -INSERT INTO "member" VALUES(
> - 4,'e2dc604c-d93a-4b91-b5a8-749e3caade36',1,'test(a)example.com',4,NULL,8,2);
> -
> -INSERT INTO "preferences" VALUES(1,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(2,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(3,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(4,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(5,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(6,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(7,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -INSERT INTO "preferences" VALUES(8,NULL,NULL,NULL,NULL,NULL,NULL,NULL);
> -
> -INSERT INTO "user" VALUES(
> - 1,'Anne Person',NULL,'0adf3caa-6f26-46f8-a11d-5256c8148592',
> - '2012-04-19 00:49:42.370493',1,1);
> -INSERT INTO "user" VALUES(
> - 2,'Bart Person',NULL,'63f5d1a2-e533-4055-afe4-475dec3b1163',
> - '2012-04-19 00:49:52.868746',2,3);
> -
> -INSERT INTO "uid" VALUES(1,'8bf9a615-f23e-4980-b7d1-90ac0203c66f');
> -INSERT INTO "uid" VALUES(2,'0adf3caa-6f26-46f8-a11d-5256c8148592');
> -INSERT INTO "uid" VALUES(3,'63f5d1a2-e533-4055-afe4-475dec3b1163');
> -INSERT INTO "uid" VALUES(4,'d1243f4d-e604-4f6b-af52-98d0a7bce0f1');
> -INSERT INTO "uid" VALUES(5,'dccc3851-fdfb-4afa-90cf-bdcbf80ad0fd');
> -INSERT INTO "uid" VALUES(6,'479be431-45f2-473d-bc3c-7eac614030ac');
> -INSERT INTO "uid" VALUES(7,'e2dc604c-d93a-4b91-b5a8-749e3caade36');
>
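A side note on the sample data being removed above: the recurring
X'80025D71012E' literals are SQLite hex blobs holding a protocol-2 pickle of
an empty Python list, matching the Pickle() columns in the old Storm model.
A quick way to confirm that (a reader's sketch, Python 3 shown; not part of
the branch):

    import pickle

    # X'80025D71012E' from the sample data above, decoded from hex.
    blob = bytes.fromhex('80025D71012E')
    print(pickle.loads(blob))   # prints: []
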
> === added file 'src/mailman/database/tests/test_factory.py'
> --- src/mailman/database/tests/test_factory.py 1970-01-01 00:00:00 +0000
> +++ src/mailman/database/tests/test_factory.py 2014-10-14 01:13:52 +0000
> @@ -0,0 +1,160 @@
> +# Copyright (C) 2013-2014 by the Free Software Foundation, Inc.
> +#
> +# This file is part of GNU Mailman.
> +#
> +# GNU Mailman is free software: you can redistribute it and/or modify it under
> +# the terms of the GNU General Public License as published by the Free
> +# Software Foundation, either version 3 of the License, or (at your option)
> +# any later version.
> +#
> +# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> +# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> +# more details.
> +#
> +# You should have received a copy of the GNU General Public License along with
> +# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> +
> +"""Test database schema migrations"""
> +
> +from __future__ import absolute_import, print_function, unicode_literals
> +
> +__metaclass__ = type
> +__all__ = [
> + 'TestSchemaManager',
> + ]
> +
> +
> +import unittest
> +import alembic.command
> +
> +from mock import patch
> +from sqlalchemy import MetaData, Table, Column, Integer, Unicode
> +from sqlalchemy.exc import ProgrammingError, OperationalError
> +from sqlalchemy.schema import Index
> +
> +from mailman.config import config
> +from mailman.database.alembic import alembic_cfg
> +from mailman.database.factory import LAST_STORM_SCHEMA_VERSION, SchemaManager
> +from mailman.database.model import Model
> +from mailman.interfaces.database import DatabaseError
> +from mailman.testing.layers import ConfigLayer
> +
> +
> +
> +class TestSchemaManager(unittest.TestCase):
> +
> + layer = ConfigLayer
> +
> + def setUp(self):
> + # Drop the existing database.
> + Model.metadata.drop_all(config.db.engine)
> + md = MetaData()
> + md.reflect(bind=config.db.engine)
> + for tablename in ('alembic_version', 'version'):
> + if tablename in md.tables:
> + md.tables[tablename].drop(config.db.engine)
> + self.schema_mgr = SchemaManager(config.db)
> +
> + def tearDown(self):
> + self._drop_storm_database()
> + # Restore a virgin database.
> + Model.metadata.create_all(config.db.engine)
> +
> + def _table_exists(self, tablename):
> + md = MetaData()
> + md.reflect(bind=config.db.engine)
> + return tablename in md.tables
> +
> + def _create_storm_database(self, revision):
> + version_table = Table(
> + 'version', Model.metadata,
> + Column('id', Integer, primary_key=True),
> + Column('component', Unicode),
> + Column('version', Unicode),
> + )
> + version_table.create(config.db.engine)
> + config.db.store.execute(version_table.insert().values(
> + component='schema', version=revision))
> + config.db.commit()
> + # Other Storm specific changes, those SQL statements hopefully work on
> + # all DB engines...
> + config.db.engine.execute(
> + 'ALTER TABLE mailinglist ADD COLUMN acceptable_aliases_id INT')
> + Index('ix_user__user_id').drop(bind=config.db.engine)
> + # Don't pollute our main metadata object, create a new one.
> + md = MetaData()
> + user_table = Model.metadata.tables['user'].tometadata(md)
> + Index('ix_user_user_id', user_table.c._user_id).create(
> + bind=config.db.engine)
> + config.db.commit()
> +
> + def _drop_storm_database(self):
> + """Remove the leftovers from a Storm DB.
> +
> + A drop_all() must be issued afterwards.
> + """
> + if 'version' in Model.metadata.tables:
> + version = Model.metadata.tables['version']
> + version.drop(config.db.engine, checkfirst=True)
> + Model.metadata.remove(version)
> + try:
> + Index('ix_user_user_id').drop(bind=config.db.engine)
> + except (ProgrammingError, OperationalError):
> + # Nonexistent. PostgreSQL raises a ProgrammingError, while SQLite
> + # raises an OperationalError.
> + pass
> + config.db.commit()
> +
> + def test_current_database(self):
> + # The database is already at the latest version.
> + alembic.command.stamp(alembic_cfg, 'head')
> + with patch('alembic.command') as alembic_command:
> + self.schema_mgr.setup_database()
> + self.assertFalse(alembic_command.stamp.called)
> + self.assertFalse(alembic_command.upgrade.called)
> +
> + @patch('alembic.command')
> + def test_initial(self, alembic_command):
> + # No existing database.
> + self.assertFalse(self._table_exists('mailinglist'))
> + self.assertFalse(self._table_exists('alembic_version'))
> + self.schema_mgr.setup_database()
> + self.assertFalse(alembic_command.upgrade.called)
> + self.assertTrue(self._table_exists('mailinglist'))
> + self.assertTrue(self._table_exists('alembic_version'))
> +
> + @patch('alembic.command.stamp')
> + def test_storm(self, alembic_command_stamp):
> + # Existing Storm database.
> + Model.metadata.create_all(config.db.engine)
> + self._create_storm_database(LAST_STORM_SCHEMA_VERSION)
> + self.schema_mgr.setup_database()
> + self.assertFalse(alembic_command_stamp.called)
> + self.assertTrue(
> + self._table_exists('mailinglist')
> + and self._table_exists('alembic_version')
> + and not self._table_exists('version'))
> +
> + @patch('alembic.command')
> + def test_old_storm(self, alembic_command):
> + # Existing Storm database in an old version.
> + Model.metadata.create_all(config.db.engine)
> + self._create_storm_database('001')
> + self.assertRaises(DatabaseError, self.schema_mgr.setup_database)
> + self.assertFalse(alembic_command.stamp.called)
> + self.assertFalse(alembic_command.upgrade.called)
> +
> + def test_old_db(self):
> + # The database is in an old revision, must upgrade.
> + alembic.command.stamp(alembic_cfg, 'head')
> + md = MetaData()
> + md.reflect(bind=config.db.engine)
> + config.db.store.execute(md.tables['alembic_version'].delete())
> + config.db.store.execute(md.tables['alembic_version'].insert().values(
> + version_num='dummyrevision'))
> + config.db.commit()
> + with patch('alembic.command') as alembic_command:
> + self.schema_mgr.setup_database()
> + self.assertFalse(alembic_command.stamp.called)
> + self.assertTrue(alembic_command.upgrade.called)
>
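For readers skimming the new test module above, the five cases it covers
imply roughly the following decision tree inside SchemaManager.setup_database().
This is only a reconstruction from the test assertions, with invented names
and values; it is not the code on the branch:

    def plan(has_storm_version_table, storm_version, last_storm_version,
             alembic_revision, head_revision):
        # Decision tree suggested by TestSchemaManager (reader's sketch).
        if has_storm_version_table:
            # A leftover Storm database.  Only the very last Storm schema can
            # be migrated; anything older raises (the real code is expected
            # to raise DatabaseError).
            if storm_version != last_storm_version:
                raise RuntimeError('Storm schema too old to migrate')
            return 'upgrade'              # migrate via Alembic, no stamp
        if alembic_revision is None:
            return 'create and stamp'     # fresh install: create_all(), stamp head
        if alembic_revision != head_revision:
            return 'upgrade'              # older Alembic revision: run migrations
        return 'nothing'                  # already at head

    # test_initial and test_old_db, respectively:
    print(plan(False, None, 'last', None, 'head'))             # create and stamp
    print(plan(False, None, 'last', 'dummyrevision', 'head'))  # upgrade
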
> === removed file 'src/mailman/database/tests/test_migrations.py'
> --- src/mailman/database/tests/test_migrations.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/database/tests/test_migrations.py 1970-01-01 00:00:00 +0000
> @@ -1,506 +0,0 @@
> -# Copyright (C) 2012-2014 by the Free Software Foundation, Inc.
> -#
> -# This file is part of GNU Mailman.
> -#
> -# GNU Mailman is free software: you can redistribute it and/or modify it under
> -# the terms of the GNU General Public License as published by the Free
> -# Software Foundation, either version 3 of the License, or (at your option)
> -# any later version.
> -#
> -# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> -# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> -# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> -# more details.
> -#
> -# You should have received a copy of the GNU General Public License along with
> -# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> -
> -"""Test schema migrations."""
> -
> -from __future__ import absolute_import, print_function, unicode_literals
> -
> -__metaclass__ = type
> -__all__ = [
> - 'TestMigration20120407MigratedData',
> - 'TestMigration20120407Schema',
> - 'TestMigration20120407UnchangedData',
> - 'TestMigration20121015MigratedData',
> - 'TestMigration20121015Schema',
> - 'TestMigration20130406MigratedData',
> - 'TestMigration20130406Schema',
> - ]
> -
> -
> -import unittest
> -
> -from datetime import datetime
> -from operator import attrgetter
> -from pkg_resources import resource_string
> -from sqlite3 import OperationalError
> -from storm.exceptions import DatabaseError
> -from zope.component import getUtility
> -
> -from mailman.interfaces.database import IDatabaseFactory
> -from mailman.interfaces.domain import IDomainManager
> -from mailman.interfaces.archiver import ArchivePolicy
> -from mailman.interfaces.bounce import BounceContext
> -from mailman.interfaces.listmanager import IListManager
> -from mailman.interfaces.mailinglist import IAcceptableAliasSet
> -from mailman.interfaces.nntp import NewsgroupModeration
> -from mailman.interfaces.subscriptions import ISubscriptionService
> -from mailman.model.bans import Ban
> -from mailman.model.bounce import BounceEvent
> -from mailman.testing.helpers import temporary_db
> -from mailman.testing.layers import ConfigLayer
> -
> -
> -
> -class MigrationTestBase(unittest.TestCase):
> - """Test database migrations."""
> -
> - layer = ConfigLayer
> -
> - def setUp(self):
> - self._database = getUtility(IDatabaseFactory, 'temporary').create()
> -
> - def tearDown(self):
> - self._database._cleanup()
> -
> - def _table_missing_present(self, migrations, missing, present):
> - """The appropriate migrations leave some tables missing and present.
> -
> - :param migrations: Sequence of migrations to load.
> - :param missing: Tables which should be missing.
> - :param present: Tables which should be present.
> - """
> - for migration in migrations:
> - self._database.load_migrations(migration)
> - self._database.store.commit()
> - for table in missing:
> - self.assertRaises(OperationalError,
> - self._database.store.execute,
> - 'select * from {};'.format(table))
> - for table in present:
> - self._database.store.execute('select * from {};'.format(table))
> -
> - def _missing_present(self, table, migrations, missing, present):
> - """The appropriate migrations leave columns missing and present.
> -
> - :param table: The table to test columns from.
> - :param migrations: Sequence of migrations to load.
> - :param missing: Set of columns which should be missing after the
> - migrations are loaded.
> - :param present: Set of columns which should be present after the
> - migrations are loaded.
> - """
> - for migration in migrations:
> - self._database.load_migrations(migration)
> - self._database.store.commit()
> - for column in missing:
> - self.assertRaises(DatabaseError,
> - self._database.store.execute,
> - 'select {0} from {1};'.format(column, table))
> - self._database.store.rollback()
> - for column in present:
> - # This should not produce an exception. Is there some better test
> - # that we can perform?
> - self._database.store.execute(
> - 'select {0} from {1};'.format(column, table))
> -
> -
> -
> -class TestMigration20120407Schema(MigrationTestBase):
> - """Test column migrations."""
> -
> - def test_pre_upgrade_columns_migration(self):
> - # Test that before the migration, the old table columns are present
> - # and the new database columns are not.
> - self._missing_present('mailinglist',
> - ['20120406999999'],
> - # New columns are missing.
> - ('allow_list_posts',
> - 'archive_policy',
> - 'list_id',
> - 'nntp_prefix_subject_too'),
> - # Old columns are present.
> - ('archive',
> - 'archive_private',
> - 'archive_volume_frequency',
> - 'generic_nonmember_action',
> - 'include_list_post_header',
> - 'news_moderation',
> - 'news_prefix_subject_too',
> - 'nntp_host'))
> - self._missing_present('member',
> - ['20120406999999'],
> - ('list_id',),
> - ('mailing_list',))
> -
> - def test_post_upgrade_columns_migration(self):
> - # Test that after the migration, the old table columns are missing
> - # and the new database columns are present.
> - self._missing_present('mailinglist',
> - ['20120406999999',
> - '20120407000000'],
> - # The old columns are missing.
> - ('archive',
> - 'archive_private',
> - 'archive_volume_frequency',
> - 'generic_nonmember_action',
> - 'include_list_post_header',
> - 'news_moderation',
> - 'news_prefix_subject_too',
> - 'nntp_host'),
> - # The new columns are present.
> - ('allow_list_posts',
> - 'archive_policy',
> - 'list_id',
> - 'nntp_prefix_subject_too'))
> - self._missing_present('member',
> - ['20120406999999',
> - '20120407000000'],
> - ('mailing_list',),
> - ('list_id',))
> -
> -
> -
> -class TestMigration20120407UnchangedData(MigrationTestBase):
> - """Test non-migrated data."""
> -
> - def setUp(self):
> - MigrationTestBase.setUp(self)
> - # Load all the migrations to just before the one we're testing.
> - self._database.load_migrations('20120406999999')
> - # Load the previous schema's sample data.
> - sample_data = resource_string(
> - 'mailman.database.tests.data',
> - 'migration_{0}_1.sql'.format(self._database.TAG))
> - self._database.load_sql(self._database.store, sample_data)
> - # XXX 2012-12-28: We have to load the last migration defined in the
> - # system, otherwise the ORM model will not match the SQL table
> - # definitions and we'll get OperationalErrors from SQLite.
> - self._database.load_migrations('20121015000000')
> -
> - def test_migration_domains(self):
> - # Test that the domains table, which isn't touched, doesn't change.
> - with temporary_db(self._database):
> - # Check that the domains survived the migration. This table
> - # was not touched so it should be fine.
> - domains = list(getUtility(IDomainManager))
> - self.assertEqual(len(domains), 1)
> - self.assertEqual(domains[0].mail_host, 'example.com')
> -
> - def test_migration_mailing_lists(self):
> - # Test that the mailing lists survive migration.
> - with temporary_db(self._database):
> - # There should be exactly one mailing list defined.
> - mlists = list(getUtility(IListManager).mailing_lists)
> - self.assertEqual(len(mlists), 1)
> - self.assertEqual(mlists[0].fqdn_listname, 'test(a)example.com')
> -
> - def test_migration_acceptable_aliases(self):
> - # Test that the mailing list's acceptable aliases survive migration.
> - # This proves that foreign key references are migrated properly.
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - aliases_set = IAcceptableAliasSet(mlist)
> - self.assertEqual(set(aliases_set.aliases),
> - set(['foo(a)example.com', 'bar(a)example.com']))
> -
> - def test_migration_members(self):
> - # Test that the members of a mailing list all survive migration.
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - # Test that all the members we expect are still there. Start with
> - # the two list delivery members.
> - addresses = set(address.email
> - for address in mlist.members.addresses)
> - self.assertEqual(addresses,
> - set(['anne(a)example.com', 'bart(a)example.com']))
> - # There is one owner.
> - owners = set(address.email for address in mlist.owners.addresses)
> - self.assertEqual(len(owners), 1)
> - self.assertEqual(owners.pop(), 'anne(a)example.com')
> - # There is one moderator.
> - moderators = set(address.email
> - for address in mlist.moderators.addresses)
> - self.assertEqual(len(moderators), 1)
> - self.assertEqual(moderators.pop(), 'bart(a)example.com')
> -
> -
> -
> -class TestMigration20120407MigratedData(MigrationTestBase):
> - """Test affected migration data."""
> -
> - def setUp(self):
> - MigrationTestBase.setUp(self)
> - # Load all the migrations to just before the one we're testing.
> - self._database.load_migrations('20120406999999')
> - # Load the previous schema's sample data.
> - sample_data = resource_string(
> - 'mailman.database.tests.data',
> - 'migration_{0}_1.sql'.format(self._database.TAG))
> - self._database.load_sql(self._database.store, sample_data)
> -
> - def _upgrade(self):
> - # XXX 2012-12-28: We have to load the last migration defined in the
> - # system, otherwise the ORM model will not match the SQL table
> - # definitions and we'll get OperationalErrors from SQLite.
> - self._database.load_migrations('20121015000000')
> -
> - def test_migration_archive_policy_never_0(self):
> - # Test that the new archive_policy value is updated correctly. In the
> - # case of old column archive=0, the archive_private column is
> - # ignored. This test sets it to 0 to ensure it's ignored.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET archive = {0}, archive_private = {0} '
> - 'WHERE id = 1;'.format(self._database.FALSE))
> - # Complete the migration
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.archive_policy, ArchivePolicy.never)
> -
> - def test_migration_archive_policy_never_1(self):
> - # Test that the new archive_policy value is updated correctly. In the
> - # case of old column archive=0, the archive_private column is
> - # ignored. This test sets it to 1 to ensure it's ignored.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET archive = {0}, archive_private = {1} '
> - 'WHERE id = 1;'.format(self._database.FALSE,
> - self._database.TRUE))
> - # Complete the migration
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.archive_policy, ArchivePolicy.never)
> -
> - def test_archive_policy_private(self):
> - # Test that the new archive_policy value is updated correctly for
> - # private archives.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET archive = {0}, archive_private = {0} '
> - 'WHERE id = 1;'.format(self._database.TRUE))
> - # Complete the migration
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.archive_policy, ArchivePolicy.private)
> -
> - def test_archive_policy_public(self):
> - # Test that the new archive_policy value is updated correctly for
> - # public archives.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET archive = {1}, archive_private = {0} '
> - 'WHERE id = 1;'.format(self._database.FALSE,
> - self._database.TRUE))
> - # Complete the migration
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.archive_policy, ArchivePolicy.public)
> -
> - def test_list_id(self):
> - # Test that the mailinglist table gets a list_id column.
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.list_id, 'test.example.com')
> -
> - def test_list_id_member(self):
> - # Test that the member table's mailing_list column becomes list_id.
> - self._upgrade()
> - with temporary_db(self._database):
> - service = getUtility(ISubscriptionService)
> - members = list(service.find_members(list_id='test.example.com'))
> - self.assertEqual(len(members), 4)
> -
> - def test_news_moderation_none(self):
> - # Test that news_moderation becomes newsgroup_moderation.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET news_moderation = 0 '
> - 'WHERE id = 1;')
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.newsgroup_moderation,
> - NewsgroupModeration.none)
> -
> - def test_news_moderation_open_moderated(self):
> - # Test that news_moderation becomes newsgroup_moderation.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET news_moderation = 1 '
> - 'WHERE id = 1;')
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.newsgroup_moderation,
> - NewsgroupModeration.open_moderated)
> -
> - def test_news_moderation_moderated(self):
> - # Test that news_moderation becomes newsgroup_moderation.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET news_moderation = 2 '
> - 'WHERE id = 1;')
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertEqual(mlist.newsgroup_moderation,
> - NewsgroupModeration.moderated)
> -
> - def test_nntp_prefix_subject_too_false(self):
> - # Test that news_prefix_subject_too becomes nntp_prefix_subject_too.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET news_prefix_subject_too = {0} '
> - 'WHERE id = 1;'.format(self._database.FALSE))
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertFalse(mlist.nntp_prefix_subject_too)
> -
> - def test_nntp_prefix_subject_too_true(self):
> - # Test that news_prefix_subject_too becomes nntp_prefix_subject_too.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET news_prefix_subject_too = {0} '
> - 'WHERE id = 1;'.format(self._database.TRUE))
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertTrue(mlist.nntp_prefix_subject_too)
> -
> - def test_allow_list_posts_false(self):
> - # Test that include_list_post_header -> allow_list_posts.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET include_list_post_header = {0} '
> - 'WHERE id = 1;'.format(self._database.FALSE))
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertFalse(mlist.allow_list_posts)
> -
> - def test_allow_list_posts_true(self):
> - # Test that include_list_post_header -> allow_list_posts.
> - self._database.store.execute(
> - 'UPDATE mailinglist SET include_list_post_header = {0} '
> - 'WHERE id = 1;'.format(self._database.TRUE))
> - self._upgrade()
> - with temporary_db(self._database):
> - mlist = getUtility(IListManager).get('test(a)example.com')
> - self.assertTrue(mlist.allow_list_posts)
> -
> -
> -
> -class TestMigration20121015Schema(MigrationTestBase):
> - """Test column migrations."""
> -
> - def test_pre_upgrade_column_migrations(self):
> - self._missing_present('ban',
> - ['20121014999999'],
> - ('list_id',),
> - ('mailing_list',))
> - self._missing_present('mailinglist',
> - ['20121014999999'],
> - (),
> - ('new_member_options', 'send_reminders',
> - 'subscribe_policy', 'unsubscribe_policy',
> - 'subscribe_auto_approval', 'private_roster',
> - 'admin_member_chunksize'),
> - )
> -
> - def test_post_upgrade_column_migrations(self):
> - self._missing_present('ban',
> - ['20121014999999',
> - '20121015000000'],
> - ('mailing_list',),
> - ('list_id',))
> - self._missing_present('mailinglist',
> - ['20121014999999',
> - '20121015000000'],
> - ('new_member_options', 'send_reminders',
> - 'subscribe_policy', 'unsubscribe_policy',
> - 'subscribe_auto_approval', 'private_roster',
> - 'admin_member_chunksize'),
> - ())
> -
> -
> -
> -class TestMigration20121015MigratedData(MigrationTestBase):
> - """Test non-migrated data."""
> -
> - def test_migration_bans(self):
> - # Load all the migrations to just before the one we're testing.
> - self._database.load_migrations('20121014999999')
> - # Insert a list-specific ban.
> - self._database.store.execute("""
> - INSERT INTO ban VALUES (
> - 1, 'anne(a)example.com', 'test(a)example.com');
> - """)
> - # Insert a global ban.
> - self._database.store.execute("""
> - INSERT INTO ban VALUES (
> - 2, 'bart(a)example.com', NULL);
> - """)
> - # Update to the current migration we're testing.
> - self._database.load_migrations('20121015000000')
> - # Now both the local and global bans should still be present.
> - bans = sorted(self._database.store.find(Ban),
> - key=attrgetter('email'))
> - self.assertEqual(bans[0].email, 'anne(a)example.com')
> - self.assertEqual(bans[0].list_id, 'test.example.com')
> - self.assertEqual(bans[1].email, 'bart(a)example.com')
> - self.assertEqual(bans[1].list_id, None)
> -
> -
> -
> -class TestMigration20130406Schema(MigrationTestBase):
> - """Test column migrations."""
> -
> - def test_pre_upgrade_column_migrations(self):
> - self._missing_present('bounceevent',
> - ['20130405999999'],
> - ('list_id',),
> - ('list_name',))
> -
> - def test_post_upgrade_column_migrations(self):
> - self._missing_present('bounceevent',
> - ['20130405999999',
> - '20130406000000'],
> - ('list_name',),
> - ('list_id',))
> -
> - def test_pre_listarchiver_table(self):
> - self._table_missing_present(['20130405999999'], ('listarchiver',), ())
> -
> - def test_post_listarchiver_table(self):
> - self._table_missing_present(['20130405999999',
> - '20130406000000'],
> - (),
> - ('listarchiver',))
> -
> -
> -
> -class TestMigration20130406MigratedData(MigrationTestBase):
> - """Test migrated data."""
> -
> - def test_migration_bounceevent(self):
> - # Load all migrations to just before the one we're testing.
> - self._database.load_migrations('20130405999999')
> - # Insert a bounce event.
> - self._database.store.execute("""
> - INSERT INTO bounceevent VALUES (
> - 1, 'test(a)example.com', 'anne(a)example.com',
> - '2013-04-06 21:12:00', '<abc(a)example.com>',
> - 1, 0);
> - """)
> - # Update to the current migration we're testing
> - self._database.load_migrations('20130406000000')
> - # The bounce event should exist, but with a list-id instead of a fqdn
> - # list name.
> - events = list(self._database.store.find(BounceEvent))
> - self.assertEqual(len(events), 1)
> - self.assertEqual(events[0].list_id, 'test.example.com')
> - self.assertEqual(events[0].email, 'anne(a)example.com')
> - self.assertEqual(events[0].timestamp, datetime(2013, 4, 6, 21, 12))
> - self.assertEqual(events[0].message_id, '<abc(a)example.com>')
> - self.assertEqual(events[0].context, BounceContext.normal)
> - self.assertFalse(events[0].processed)
>
> === modified file 'src/mailman/database/types.py'
> --- src/mailman/database/types.py 2014-04-28 15:23:35 +0000
> +++ src/mailman/database/types.py 2014-10-14 01:13:52 +0000
> @@ -23,43 +23,70 @@
> __metaclass__ = type
> __all__ = [
> 'Enum',
> + 'UUID',
> ]
>
> +import uuid
>
> -from storm.properties import SimpleProperty
> -from storm.variables import Variable
> +from sqlalchemy import Integer
> +from sqlalchemy.dialects import postgresql
> +from sqlalchemy.types import TypeDecorator, CHAR
>
>
>
> -class _EnumVariable(Variable):
> - """Storm variable for supporting enum types.
> +class Enum(TypeDecorator):
> + """Handle Python 3.4 style enums.
>
> - To use this, make the database column a INTEGER.
> + Stores an integer-based Enum as an integer in the database, and
> + converts it on-the-fly.
> """
> -
> - def __init__(self, *args, **kws):
> - self._enum = kws.pop('enum')
> - super(_EnumVariable, self).__init__(*args, **kws)
> -
> - def parse_set(self, value, from_db):
> - if value is None:
> - return None
> - if not from_db:
> - return value
> - return self._enum(value)
> -
> - def parse_get(self, value, to_db):
> - if value is None:
> - return None
> - if not to_db:
> - return value
> + impl = Integer
> +
> + def __init__(self, enum, *args, **kw):
> + self.enum = enum
> + super(Enum, self).__init__(*args, **kw)
> +
> + def process_bind_param(self, value, dialect):
> + if value is None:
> + return None
> return value.value
>
> -
> -class Enum(SimpleProperty):
> - """Custom type for Storm supporting enums."""
> -
> - variable_class = _EnumVariable
> -
> - def __init__(self, enum=None):
> - super(Enum, self).__init__(enum=enum)
> + def process_result_value(self, value, dialect):
> + if value is None:
> + return None
> + return self.enum(value)
> +
> +
> +
> +class UUID(TypeDecorator):
> + """Platform-independent GUID type.
> +
> + Uses Postgresql's UUID type, otherwise uses
> + CHAR(32), storing as stringified hex values.
> +
> + """
> + impl = CHAR
> +
> + def load_dialect_impl(self, dialect):
> + if dialect.name == 'postgresql':
> + return dialect.type_descriptor(postgresql.UUID())
> + else:
> + return dialect.type_descriptor(CHAR(32))
> +
> + def process_bind_param(self, value, dialect):
> + if value is None:
> + return value
> + elif dialect.name == 'postgresql':
> + return str(value)
> + else:
> + if not isinstance(value, uuid.UUID):
> + return "%.32x" % uuid.UUID(value)
> + else:
> + # hexstring
> + return "%.32x" % value
> +
> + def process_result_value(self, value, dialect):
> + if value is None:
> + return value
> + else:
> + return uuid.UUID(value)
>
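The new Enum type above is a thin TypeDecorator: enum members go into the
database as their integer .value and come back as members of the enum class.
A minimal round trip, assuming this branch of Mailman is importable and using
a made-up Colour enum and widget table (SQLite in memory):

    from enum import Enum as PyEnum   # enum34 on Python 2

    from sqlalchemy import Column, Integer, create_engine
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker

    from mailman.database.types import Enum

    class Colour(PyEnum):
        red = 1
        green = 2

    Base = declarative_base()

    class Widget(Base):
        __tablename__ = 'widget'
        id = Column(Integer, primary_key=True)
        # Stored as an INTEGER column, returned as a Colour member.
        colour = Column(Enum(Colour))

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()
    session.add(Widget(colour=Colour.green))
    session.commit()
    print(session.query(Widget).one().colour)   # Colour.green

The UUID type is essentially the backend-agnostic GUID recipe from the
SQLAlchemy documentation: PostgreSQL's native UUID column where available,
CHAR(32) hex storage elsewhere.
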
> === modified file 'src/mailman/handlers/docs/owner-recips.rst'
> --- src/mailman/handlers/docs/owner-recips.rst 2012-03-23 20:34:54 +0000
> +++ src/mailman/handlers/docs/owner-recips.rst 2014-10-14 01:13:52 +0000
> @@ -41,7 +41,7 @@
> >>> handler.process(mlist_1, msg, msgdata)
> >>> dump_list(msgdata['recipients'])
> bart(a)example.com
> -
> +
> If Bart also disables his owner delivery, then no one could contact the list's
> owners. Since this is unacceptable, the site owner is used as a fallback.
>
> @@ -55,7 +55,7 @@
> a fallback.
>
> >>> mlist_2 = create_list('beta(a)example.com')
> - >>> mlist_2.administrators.member_count
> + >>> print(mlist_2.administrators.member_count)
> 0
> >>> msgdata = {}
> >>> handler.process(mlist_2, msg, msgdata)
>
> === modified file 'src/mailman/interfaces/database.py'
> --- src/mailman/interfaces/database.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/interfaces/database.py 2014-10-14 01:13:52 +0000
> @@ -24,7 +24,6 @@
> 'DatabaseError',
> 'IDatabase',
> 'IDatabaseFactory',
> - 'ITemporaryDatabase',
> ]
>
>
> @@ -61,12 +60,7 @@
> """Abort the current transaction."""
>
> store = Attribute(
> - """The underlying Storm store on which you can do queries.""")
> -
> -
> -
> -class ITemporaryDatabase(Interface):
> - """Marker interface for test suite adaptation."""
> + """The underlying database object on which you can do queries.""")
>
>
>
>
> === modified file 'src/mailman/interfaces/messages.py'
> --- src/mailman/interfaces/messages.py 2014-04-28 15:23:35 +0000
> +++ src/mailman/interfaces/messages.py 2014-10-14 01:13:52 +0000
> @@ -83,7 +83,7 @@
>
> def get_message_by_hash(message_id_hash):
> """Return the message with the matching X-Message-ID-Hash.
> -
> +
> :param message_id_hash: The X-Message-ID-Hash header contents to
> search for.
> :returns: The message, or None if no matching message was found.
>
> === modified file 'src/mailman/model/address.py'
> --- src/mailman/model/address.py 2014-04-15 03:00:41 +0000
> +++ src/mailman/model/address.py 2014-10-14 01:13:52 +0000
> @@ -26,7 +26,9 @@
>
>
> from email.utils import formataddr
> -from storm.locals import DateTime, Int, Reference, Unicode
> +from sqlalchemy import (
> + Column, DateTime, ForeignKey, Integer, Unicode)
> +from sqlalchemy.orm import relationship, backref
> from zope.component import getUtility
> from zope.event import notify
> from zope.interface import implementer
> @@ -42,17 +44,20 @@
> class Address(Model):
> """See `IAddress`."""
>
> - id = Int(primary=True)
> - email = Unicode()
> - _original = Unicode()
> - display_name = Unicode()
> - _verified_on = DateTime(name='verified_on')
> - registered_on = DateTime()
> -
> - user_id = Int()
> - user = Reference(user_id, 'User.id')
> - preferences_id = Int()
> - preferences = Reference(preferences_id, 'Preferences.id')
> + __tablename__ = 'address'
> +
> + id = Column(Integer, primary_key=True)
> + email = Column(Unicode)
> + _original = Column(Unicode)
> + display_name = Column(Unicode)
> + _verified_on = Column('verified_on', DateTime)
> + registered_on = Column(DateTime)
> +
> + user_id = Column(Integer, ForeignKey('user.id'), index=True)
> +
> + preferences_id = Column(Integer, ForeignKey('preferences.id'), index=True)
> + preferences = relationship(
> + 'Preferences', backref=backref('address', uselist=False))
>
> def __init__(self, email, display_name):
> super(Address, self).__init__()
>
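One detail worth calling out in the Address conversion above:
backref('address', uselist=False) makes the reverse attribute on Preferences
a scalar (one-to-one) rather than a list. A self-contained sketch of the same
pattern with stand-in models, not Mailman's real ones:

    from sqlalchemy import (
        Column, ForeignKey, Integer, Unicode, create_engine)
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import backref, relationship, sessionmaker

    Base = declarative_base()

    class Preferences(Base):
        __tablename__ = 'preferences'
        id = Column(Integer, primary_key=True)

    class Address(Base):
        __tablename__ = 'address'
        id = Column(Integer, primary_key=True)
        email = Column(Unicode)
        preferences_id = Column(
            Integer, ForeignKey('preferences.id'), index=True)
        preferences = relationship(
            'Preferences', backref=backref('address', uselist=False))

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()
    prefs = Preferences()
    session.add(Address(email=u'anne@example.com', preferences=prefs))
    session.commit()
    print(prefs.address.email)   # a single Address, not a one-element list
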
> === modified file 'src/mailman/model/autorespond.py'
> --- src/mailman/model/autorespond.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/autorespond.py 2014-10-14 01:13:52 +0000
> @@ -26,7 +26,9 @@
> ]
>
>
> -from storm.locals import And, Date, Desc, Int, Reference
> +from sqlalchemy import Column, Date, ForeignKey, Integer
> +from sqlalchemy import desc
> +from sqlalchemy.orm import relationship
> from zope.interface import implementer
>
> from mailman.database.model import Model
> @@ -42,16 +44,18 @@
> class AutoResponseRecord(Model):
> """See `IAutoResponseRecord`."""
>
> - id = Int(primary=True)
> -
> - address_id = Int()
> - address = Reference(address_id, 'Address.id')
> -
> - mailing_list_id = Int()
> - mailing_list = Reference(mailing_list_id, 'MailingList.id')
> -
> - response_type = Enum(Response)
> - date_sent = Date()
> + __tablename__ = 'autoresponserecord'
> +
> + id = Column(Integer, primary_key=True)
> +
> + address_id = Column(Integer, ForeignKey('address.id'), index=True)
> + address = relationship('Address')
> +
> + mailing_list_id = Column(Integer, ForeignKey('mailinglist.id'), index=True)
> + mailing_list = relationship('MailingList')
> +
> + response_type = Column(Enum(Response))
> + date_sent = Column(Date)
>
> def __init__(self, mailing_list, address, response_type):
> self.mailing_list = mailing_list
> @@ -71,12 +75,11 @@
> @dbconnection
> def todays_count(self, store, address, response_type):
> """See `IAutoResponseSet`."""
> - return store.find(
> - AutoResponseRecord,
> - And(AutoResponseRecord.address == address,
> - AutoResponseRecord.mailing_list == self._mailing_list,
> - AutoResponseRecord.response_type == response_type,
> - AutoResponseRecord.date_sent == today())).count()
> + return store.query(AutoResponseRecord).filter_by(
> + address=address,
> + mailing_list=self._mailing_list,
> + response_type=response_type,
> + date_sent=today()).count()
>
> @dbconnection
> def response_sent(self, store, address, response_type):
> @@ -88,10 +91,9 @@
> @dbconnection
> def last_response(self, store, address, response_type):
> """See `IAutoResponseSet`."""
> - results = store.find(
> - AutoResponseRecord,
> - And(AutoResponseRecord.address == address,
> - AutoResponseRecord.mailing_list == self._mailing_list,
> - AutoResponseRecord.response_type == response_type)
> - ).order_by(Desc(AutoResponseRecord.date_sent))
> + results = store.query(AutoResponseRecord).filter_by(
> + address=address,
> + mailing_list=self._mailing_list,
> + response_type=response_type
> + ).order_by(desc(AutoResponseRecord.date_sent))
> return (None if results.count() == 0 else results.first())
>
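The query changes in this file show the translation pattern used throughout
the branch: store.find(Model, And(...)) becomes store.query(Model).filter_by(...),
Desc(column) becomes desc(column), and result sets keep their .count() and
.first() shape. A toy round trip of the same idioms (names invented, SQLite
in memory):

    from datetime import date

    from sqlalchemy import (
        Column, Date, Integer, Unicode, create_engine, desc)
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker

    Base = declarative_base()

    class Record(Base):
        __tablename__ = 'record'
        id = Column(Integer, primary_key=True)
        address = Column(Unicode)
        date_sent = Column(Date)

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()
    session.add_all([
        Record(address=u'anne@example.com', date_sent=date(2014, 10, 1)),
        Record(address=u'anne@example.com', date_sent=date(2014, 10, 2)),
        ])
    session.commit()

    query = session.query(Record).filter_by(address=u'anne@example.com')
    print(query.count())                                  # 2
    newest = query.order_by(desc(Record.date_sent)).first()
    print(newest.date_sent)                               # 2014-10-02
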
> === modified file 'src/mailman/model/bans.py'
> --- src/mailman/model/bans.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/bans.py 2014-10-14 01:13:52 +0000
> @@ -27,7 +27,7 @@
>
> import re
>
> -from storm.locals import Int, Unicode
> +from sqlalchemy import Column, Integer, Unicode
> from zope.interface import implementer
>
> from mailman.database.model import Model
> @@ -40,9 +40,11 @@
> class Ban(Model):
> """See `IBan`."""
>
> - id = Int(primary=True)
> - email = Unicode()
> - list_id = Unicode()
> + __tablename__ = 'ban'
> +
> + id = Column(Integer, primary_key=True)
> + email = Column(Unicode)
> + list_id = Column(Unicode)
>
> def __init__(self, email, list_id):
> super(Ban, self).__init__()
> @@ -62,7 +64,7 @@
> @dbconnection
> def ban(self, store, email):
> """See `IBanManager`."""
> - bans = store.find(Ban, email=email, list_id=self._list_id)
> + bans = store.query(Ban).filter_by(email=email, list_id=self._list_id)
> if bans.count() == 0:
> ban = Ban(email, self._list_id)
> store.add(ban)
> @@ -70,9 +72,10 @@
> @dbconnection
> def unban(self, store, email):
> """See `IBanManager`."""
> - ban = store.find(Ban, email=email, list_id=self._list_id).one()
> + ban = store.query(Ban).filter_by(
> + email=email, list_id=self._list_id).first()
> if ban is not None:
> - store.remove(ban)
> + store.delete(ban)
>
> @dbconnection
> def is_banned(self, store, email):
> @@ -81,32 +84,32 @@
> if list_id is None:
> # The client is asking for global bans. Look up bans on the
> # specific email address first.
> - bans = store.find(Ban, email=email, list_id=None)
> + bans = store.query(Ban).filter_by(email=email, list_id=None)
> if bans.count() > 0:
> return True
> # And now look for global pattern bans.
> - bans = store.find(Ban, list_id=None)
> + bans = store.query(Ban).filter_by(list_id=None)
> for ban in bans:
> if (ban.email.startswith('^') and
> re.match(ban.email, email, re.IGNORECASE) is not None):
> return True
> else:
> # This is a list-specific ban.
> - bans = store.find(Ban, email=email, list_id=list_id)
> + bans = store.query(Ban).filter_by(email=email, list_id=list_id)
> if bans.count() > 0:
> return True
> # Try global bans next.
> - bans = store.find(Ban, email=email, list_id=None)
> + bans = store.query(Ban).filter_by(email=email, list_id=None)
> if bans.count() > 0:
> return True
> # Now try specific mailing list bans, but with a pattern.
> - bans = store.find(Ban, list_id=list_id)
> + bans = store.query(Ban).filter_by(list_id=list_id)
> for ban in bans:
> if (ban.email.startswith('^') and
> re.match(ban.email, email, re.IGNORECASE) is not None):
> return True
> # And now try global pattern bans.
> - bans = store.find(Ban, list_id=None)
> + bans = store.query(Ban).filter_by(list_id=None)
> for ban in bans:
> if (ban.email.startswith('^') and
> re.match(ban.email, email, re.IGNORECASE) is not None):
>
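The ported is_banned() above keeps the old Storm semantics: a ban whose email
starts with '^' is treated as a case-insensitive regular expression, anything
else matches only by exact lookup. The check itself, pulled out into a
standalone function purely for illustration:

    import re

    def pattern_ban_matches(ban_email, email):
        # Mirrors the test applied to each candidate ban in is_banned().
        return (ban_email.startswith('^') and
                re.match(ban_email, email, re.IGNORECASE) is not None)

    print(pattern_ban_matches(r'^.*@spam\.example\.com', 'Anne@Spam.example.com'))
    # True
    print(pattern_ban_matches('anne@example.com', 'anne@example.com'))
    # False -- non-pattern bans are handled by the filter_by() lookups instead
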
> === modified file 'src/mailman/model/bounce.py'
> --- src/mailman/model/bounce.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/bounce.py 2014-10-14 01:13:52 +0000
> @@ -26,7 +26,8 @@
> ]
>
>
> -from storm.locals import Bool, Int, DateTime, Unicode
> +
> +from sqlalchemy import Boolean, Column, DateTime, Integer, Unicode
> from zope.interface import implementer
>
> from mailman.database.model import Model
> @@ -42,13 +43,15 @@
> class BounceEvent(Model):
> """See `IBounceEvent`."""
>
> - id = Int(primary=True)
> - list_id = Unicode()
> - email = Unicode()
> - timestamp = DateTime()
> - message_id = Unicode()
> - context = Enum(BounceContext)
> - processed = Bool()
> + __tablename__ = 'bounceevent'
> +
> + id = Column(Integer, primary_key=True)
> + list_id = Column(Unicode)
> + email = Column(Unicode)
> + timestamp = Column(DateTime)
> + message_id = Column(Unicode)
> + context = Column(Enum(BounceContext))
> + processed = Column(Boolean)
>
> def __init__(self, list_id, email, msg, context=None):
> self.list_id = list_id
> @@ -75,12 +78,12 @@
> @dbconnection
> def events(self, store):
> """See `IBounceProcessor`."""
> - for event in store.find(BounceEvent):
> + for event in store.query(BounceEvent).all():
> yield event
>
> @property
> @dbconnection
> def unprocessed(self, store):
> """See `IBounceProcessor`."""
> - for event in store.find(BounceEvent, BounceEvent.processed == False):
> + for event in store.query(BounceEvent).filter_by(processed=False):
> yield event
>
> === modified file 'src/mailman/model/digests.py'
> --- src/mailman/model/digests.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/digests.py 2014-10-14 01:13:52 +0000
> @@ -25,7 +25,8 @@
> ]
>
>
> -from storm.locals import Int, Reference
> +from sqlalchemy import Column, Integer, ForeignKey
> +from sqlalchemy.orm import relationship
> from zope.interface import implementer
>
> from mailman.database.model import Model
> @@ -39,15 +40,17 @@
> class OneLastDigest(Model):
> """See `IOneLastDigest`."""
>
> - id = Int(primary=True)
> -
> - mailing_list_id = Int()
> - mailing_list = Reference(mailing_list_id, 'MailingList.id')
> -
> - address_id = Int()
> - address = Reference(address_id, 'Address.id')
> -
> - delivery_mode = Enum(DeliveryMode)
> + __tablename__ = 'onelastdigest'
> +
> + id = Column(Integer, primary_key=True)
> +
> + mailing_list_id = Column(Integer, ForeignKey('mailinglist.id'))
> + mailing_list = relationship('MailingList')
> +
> + address_id = Column(Integer, ForeignKey('address.id'))
> + address = relationship('Address')
> +
> + delivery_mode = Column(Enum(DeliveryMode))
>
> def __init__(self, mailing_list, address, delivery_mode):
> self.mailing_list = mailing_list
>
> === modified file 'src/mailman/model/docs/autorespond.rst'
> --- src/mailman/model/docs/autorespond.rst 2014-04-28 15:23:35 +0000
> +++ src/mailman/model/docs/autorespond.rst 2014-10-14 01:13:52 +0000
> @@ -37,34 +37,34 @@
> ... 'aperson(a)example.com')
>
> >>> from mailman.interfaces.autorespond import Response
> - >>> response_set.todays_count(address, Response.hold)
> + >>> print(response_set.todays_count(address, Response.hold))
> 0
> - >>> response_set.todays_count(address, Response.command)
> + >>> print(response_set.todays_count(address, Response.command))
> 0
>
> Using the response set, we can record that a hold response is sent to the
> address.
>
> >>> response_set.response_sent(address, Response.hold)
> - >>> response_set.todays_count(address, Response.hold)
> + >>> print(response_set.todays_count(address, Response.hold))
> 1
> - >>> response_set.todays_count(address, Response.command)
> + >>> print(response_set.todays_count(address, Response.command))
> 0
>
> We can also record that a command response was sent.
>
> >>> response_set.response_sent(address, Response.command)
> - >>> response_set.todays_count(address, Response.hold)
> + >>> print(response_set.todays_count(address, Response.hold))
> 1
> - >>> response_set.todays_count(address, Response.command)
> + >>> print(response_set.todays_count(address, Response.command))
> 1
>
> Let's send one more.
>
> >>> response_set.response_sent(address, Response.command)
> - >>> response_set.todays_count(address, Response.hold)
> + >>> print(response_set.todays_count(address, Response.hold))
> 1
> - >>> response_set.todays_count(address, Response.command)
> + >>> print(response_set.todays_count(address, Response.command))
> 2
>
> Now the day flips over and all the counts reset.
> @@ -73,9 +73,9 @@
> >>> from mailman.utilities.datetime import factory
> >>> factory.fast_forward()
>
> - >>> response_set.todays_count(address, Response.hold)
> + >>> print(response_set.todays_count(address, Response.hold))
> 0
> - >>> response_set.todays_count(address, Response.command)
> + >>> print(response_set.todays_count(address, Response.command))
> 0
>
>
> @@ -110,7 +110,7 @@
>
> >>> address = getUtility(IUserManager).create_address(
> ... 'bperson(a)example.com')
> - >>> response_set.todays_count(address, Response.command)
> + >>> print(response_set.todays_count(address, Response.command))
> 0
> >>> print(response_set.last_response(address, Response.command))
> None
>
> === modified file 'src/mailman/model/docs/messagestore.rst'
> --- src/mailman/model/docs/messagestore.rst 2014-04-28 15:23:35 +0000
> +++ src/mailman/model/docs/messagestore.rst 2014-10-14 01:13:52 +0000
> @@ -28,8 +28,9 @@
> However, if the message has a ``Message-ID`` header, it can be stored.
>
> >>> msg['Message-ID'] = '<87myycy5eh.fsf(a)uwakimon.sk.tsukuba.ac.jp>'
> - >>> message_store.add(msg)
> - 'AGDWSNXXKCWEILKKNYTBOHRDQGOX3Y35'
> + >>> x_message_id_hash = message_store.add(msg)
> + >>> print(x_message_id_hash)
> + AGDWSNXXKCWEILKKNYTBOHRDQGOX3Y35
> >>> print(msg.as_string())
> Subject: An important message
> Message-ID: <87myycy5eh.fsf(a)uwakimon.sk.tsukuba.ac.jp>
>
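The messagestore.rst change above (and the matching print() edits in the other
doctests) is presumably about keeping the expected output stable: a bare
doctest expression echoes the repr, and the repr of a unicode string differs
between Python 2 and Python 3, while print() gives the same text on both:

    value = u'AGDWSNXXKCWEILKKNYTBOHRDQGOX3Y35'
    print(repr(value))   # u'AGDW...' on Python 2, 'AGDW...' on Python 3
    print(value)         # AGDWSNXXKCWEILKKNYTBOHRDQGOX3Y35 on both
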
> === modified file 'src/mailman/model/docs/requests.rst'
> --- src/mailman/model/docs/requests.rst 2014-04-28 15:23:35 +0000
> +++ src/mailman/model/docs/requests.rst 2014-10-14 01:13:52 +0000
> @@ -35,7 +35,7 @@
>
> The list's requests database starts out empty.
>
> - >>> requests.count
> + >>> print(requests.count)
> 0
> >>> dump_list(requests.held_requests)
> *Empty*
> @@ -68,21 +68,21 @@
>
> We can see the total number of requests being held.
>
> - >>> requests.count
> + >>> print(requests.count)
> 3
>
> We can also see the number of requests being held by request type.
>
> - >>> requests.count_of(RequestType.subscription)
> + >>> print(requests.count_of(RequestType.subscription))
> 1
> - >>> requests.count_of(RequestType.unsubscription)
> + >>> print(requests.count_of(RequestType.unsubscription))
> 1
>
> We can also see when there are multiple held requests of a particular type.
>
> - >>> requests.hold_request(RequestType.held_message, 'hold_4')
> + >>> print(requests.hold_request(RequestType.held_message, 'hold_4'))
> 4
> - >>> requests.count_of(RequestType.held_message)
> + >>> print(requests.count_of(RequestType.held_message))
> 2
>
> We can ask the requests database for a specific request, by providing the id
> @@ -132,7 +132,7 @@
> To make it easier to find specific requests, the list requests can be iterated
> over by type.
>
> - >>> requests.count_of(RequestType.held_message)
> + >>> print(requests.count_of(RequestType.held_message))
> 3
> >>> for request in requests.of_type(RequestType.held_message):
> ... key, data = requests.get_request(request.id)
> @@ -154,10 +154,10 @@
> Once a specific request has been handled, it can be deleted from the requests
> database.
>
> - >>> requests.count
> + >>> print(requests.count)
> 5
> >>> requests.delete_request(2)
> - >>> requests.count
> + >>> print(requests.count)
> 4
>
> Request 2 is no longer in the database.
> @@ -167,5 +167,5 @@
>
> >>> for request in requests.held_requests:
> ... requests.delete_request(request.id)
> - >>> requests.count
> + >>> print(requests.count)
> 0
>
> === modified file 'src/mailman/model/domain.py'
> --- src/mailman/model/domain.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/domain.py 2014-10-14 01:13:52 +0000
> @@ -26,8 +26,8 @@
> ]
>
>
> +from sqlalchemy import Column, Integer, Unicode
> from urlparse import urljoin, urlparse
> -from storm.locals import Int, Unicode
> from zope.event import notify
> from zope.interface import implementer
>
> @@ -44,12 +44,14 @@
> class Domain(Model):
> """Domains."""
>
> - id = Int(primary=True)
> -
> - mail_host = Unicode()
> - base_url = Unicode()
> - description = Unicode()
> - contact_address = Unicode()
> + __tablename__ = 'domain'
> +
> + id = Column(Integer, primary_key=True)
> +
> + mail_host = Column(Unicode)
> + base_url = Column(Unicode)
> + description = Column(Unicode)
> + contact_address = Column(Unicode)
>
> def __init__(self, mail_host,
> description=None,
> @@ -92,8 +94,7 @@
> @dbconnection
> def mailing_lists(self, store):
> """See `IDomain`."""
> - mailing_lists = store.find(
> - MailingList,
> + mailing_lists = store.query(MailingList).filter(
> MailingList.mail_host == self.mail_host)
> for mlist in mailing_lists:
> yield mlist
> @@ -140,14 +141,14 @@
> def remove(self, store, mail_host):
> domain = self[mail_host]
> notify(DomainDeletingEvent(domain))
> - store.remove(domain)
> + store.delete(domain)
> notify(DomainDeletedEvent(mail_host))
> return domain
>
> @dbconnection
> def get(self, store, mail_host, default=None):
> """See `IDomainManager`."""
> - domains = store.find(Domain, mail_host=mail_host)
> + domains = store.query(Domain).filter_by(mail_host=mail_host)
> if domains.count() < 1:
> return default
> assert domains.count() == 1, (
> @@ -164,15 +165,15 @@
>
> @dbconnection
> def __len__(self, store):
> - return store.find(Domain).count()
> + return store.query(Domain).count()
>
> @dbconnection
> def __iter__(self, store):
> """See `IDomainManager`."""
> - for domain in store.find(Domain):
> + for domain in store.query(Domain).all():
> yield domain
>
> @dbconnection
> def __contains__(self, store, mail_host):
> """See `IDomainManager`."""
> - return store.find(Domain, mail_host=mail_host).count() > 0
> + return store.query(Domain).filter_by(mail_host=mail_host).count() > 0
>
> === modified file 'src/mailman/model/language.py'
> --- src/mailman/model/language.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/language.py 2014-10-14 01:13:52 +0000
> @@ -25,11 +25,11 @@
> ]
>
>
> -from storm.locals import Int, Unicode
> +from sqlalchemy import Column, Integer, Unicode
> from zope.interface import implementer
>
> -from mailman.database import Model
> -from mailman.interfaces import ILanguage
> +from mailman.database.model import Model
> +from mailman.interfaces.languages import ILanguage
>
>
>
> @@ -37,5 +37,7 @@
> class Language(Model):
> """See `ILanguage`."""
>
> - id = Int(primary=True)
> - code = Unicode()
> + __tablename__ = 'language'
> +
> + id = Column(Integer, primary_key=True)
> + code = Column(Unicode)
>
> === modified file 'src/mailman/model/listmanager.py'
> --- src/mailman/model/listmanager.py 2014-04-14 16:14:13 +0000
> +++ src/mailman/model/listmanager.py 2014-10-14 01:13:52 +0000
> @@ -52,9 +52,7 @@
> raise InvalidEmailAddressError(fqdn_listname)
> list_id = '{0}.{1}'.format(listname, hostname)
> notify(ListCreatingEvent(fqdn_listname))
> - mlist = store.find(
> - MailingList,
> - MailingList._list_id == list_id).one()
> + mlist = store.query(MailingList).filter_by(_list_id=list_id).first()
> if mlist:
> raise ListAlreadyExistsError(fqdn_listname)
> mlist = MailingList(fqdn_listname)
> @@ -68,40 +66,40 @@
> """See `IListManager`."""
> listname, at, hostname = fqdn_listname.partition('@')
> list_id = '{0}.{1}'.format(listname, hostname)
> - return store.find(MailingList, MailingList._list_id == list_id).one()
> + return store.query(MailingList).filter_by(_list_id=list_id).first()
>
> @dbconnection
> def get_by_list_id(self, store, list_id):
> """See `IListManager`."""
> - return store.find(MailingList, MailingList._list_id == list_id).one()
> + return store.query(MailingList).filter_by(_list_id=list_id).first()
>
> @dbconnection
> def delete(self, store, mlist):
> """See `IListManager`."""
> fqdn_listname = mlist.fqdn_listname
> notify(ListDeletingEvent(mlist))
> - store.find(ContentFilter, ContentFilter.mailing_list == mlist).remove()
> - store.remove(mlist)
> + store.query(ContentFilter).filter_by(mailing_list=mlist).delete()
> + store.delete(mlist)
> notify(ListDeletedEvent(fqdn_listname))
>
> @property
> @dbconnection
> def mailing_lists(self, store):
> """See `IListManager`."""
> - for mlist in store.find(MailingList):
> + for mlist in store.query(MailingList).all():
> yield mlist
>
> @dbconnection
> def __iter__(self, store):
> """See `IListManager`."""
> - for mlist in store.find(MailingList):
> + for mlist in store.query(MailingList).all():
> yield mlist
>
> @property
> @dbconnection
> def names(self, store):
> """See `IListManager`."""
> - result_set = store.find(MailingList)
> + result_set = store.query(MailingList)
> for mail_host, list_name in result_set.values(MailingList.mail_host,
> MailingList.list_name):
> yield '{0}@{1}'.format(list_name, mail_host)
> @@ -110,15 +108,16 @@
> @dbconnection
> def list_ids(self, store):
> """See `IListManager`."""
> - result_set = store.find(MailingList)
> + result_set = store.query(MailingList)
> for list_id in result_set.values(MailingList._list_id):
> - yield list_id
> + assert isinstance(list_id, tuple) and len(list_id) == 1
> + yield list_id[0]
>
> @property
> @dbconnection
> def name_components(self, store):
> """See `IListManager`."""
> - result_set = store.find(MailingList)
> + result_set = store.query(MailingList)
> for mail_host, list_name in result_set.values(MailingList.mail_host,
> MailingList.list_name):
> yield list_name, mail_host
>
> === modified file 'src/mailman/model/mailinglist.py'
> --- src/mailman/model/mailinglist.py 2014-04-14 16:14:13 +0000
> +++ src/mailman/model/mailinglist.py 2014-10-14 01:13:52 +0000
> @@ -27,9 +27,11 @@
>
> import os
>
> -from storm.locals import (
> - And, Bool, DateTime, Float, Int, Pickle, RawStr, Reference, Store,
> - TimeDelta, Unicode)
> +from sqlalchemy import (
> + Boolean, Column, DateTime, Float, ForeignKey, Integer, Interval,
> + LargeBinary, PickleType, Unicode)
> +from sqlalchemy.event import listen
> +from sqlalchemy.orm import relationship
> from urlparse import urljoin
> from zope.component import getUtility
> from zope.event import notify
> @@ -37,6 +39,7 @@
>
> from mailman.config import config
> from mailman.database.model import Model
> +from mailman.database.transaction import dbconnection
> from mailman.database.types import Enum
> from mailman.interfaces.action import Action, FilterAction
> from mailman.interfaces.address import IAddress
> @@ -73,121 +76,121 @@
> class MailingList(Model):
> """See `IMailingList`."""
>
> - id = Int(primary=True)
> + __tablename__ = 'mailinglist'
> +
> + id = Column(Integer, primary_key=True)
>
> # XXX denotes attributes that should be part of the public interface but
> # are currently missing.
>
> # List identity
> - list_name = Unicode()
> - mail_host = Unicode()
> - _list_id = Unicode(name='list_id')
> - allow_list_posts = Bool()
> - include_rfc2369_headers = Bool()
> - advertised = Bool()
> - anonymous_list = Bool()
> + list_name = Column(Unicode)
> + mail_host = Column(Unicode)
> + _list_id = Column('list_id', Unicode)
> + allow_list_posts = Column(Boolean)
> + include_rfc2369_headers = Column(Boolean)
> + advertised = Column(Boolean)
> + anonymous_list = Column(Boolean)
> # Attributes not directly modifiable via the web u/i
> - created_at = DateTime()
> - # Attributes which are directly modifiable via the web u/i. The more
> - # complicated attributes are currently stored as pickles, though that
> - # will change as the schema and implementation is developed.
> - next_request_id = Int()
> - next_digest_number = Int()
> - digest_last_sent_at = DateTime()
> - volume = Int()
> - last_post_at = DateTime()
> - # Implicit destination.
> - acceptable_aliases_id = Int()
> - acceptable_alias = Reference(acceptable_aliases_id, 'AcceptableAlias.id')
> - # Attributes which are directly modifiable via the web u/i. The more
> - # complicated attributes are currently stored as pickles, though that
> - # will change as the schema and implementation is developed.
> - accept_these_nonmembers = Pickle() # XXX
> - admin_immed_notify = Bool()
> - admin_notify_mchanges = Bool()
> - administrivia = Bool()
> - archive_policy = Enum(ArchivePolicy)
> + created_at = Column(DateTime)
> + # Attributes which are directly modifiable via the web u/i. The more
> + # complicated attributes are currently stored as pickles, though that
> + # will change as the schema and implementation is developed.
> + next_request_id = Column(Integer)
> + next_digest_number = Column(Integer)
> + digest_last_sent_at = Column(DateTime)
> + volume = Column(Integer)
> + last_post_at = Column(DateTime)
> + # Attributes which are directly modifiable via the web u/i. The more
> + # complicated attributes are currently stored as pickles, though that
> + # will change as the schema and implementation is developed.
> + accept_these_nonmembers = Column(PickleType) # XXX
> + admin_immed_notify = Column(Boolean)
> + admin_notify_mchanges = Column(Boolean)
> + administrivia = Column(Boolean)
> + archive_policy = Column(Enum(ArchivePolicy))
> # Automatic responses.
> - autoresponse_grace_period = TimeDelta()
> - autorespond_owner = Enum(ResponseAction)
> - autoresponse_owner_text = Unicode()
> - autorespond_postings = Enum(ResponseAction)
> - autoresponse_postings_text = Unicode()
> - autorespond_requests = Enum(ResponseAction)
> - autoresponse_request_text = Unicode()
> + autoresponse_grace_period = Column(Interval)
> + autorespond_owner = Column(Enum(ResponseAction))
> + autoresponse_owner_text = Column(Unicode)
> + autorespond_postings = Column(Enum(ResponseAction))
> + autoresponse_postings_text = Column(Unicode)
> + autorespond_requests = Column(Enum(ResponseAction))
> + autoresponse_request_text = Column(Unicode)
> # Content filters.
> - filter_action = Enum(FilterAction)
> - filter_content = Bool()
> - collapse_alternatives = Bool()
> - convert_html_to_plaintext = Bool()
> + filter_action = Column(Enum(FilterAction))
> + filter_content = Column(Boolean)
> + collapse_alternatives = Column(Boolean)
> + convert_html_to_plaintext = Column(Boolean)
> # Bounces.
> - bounce_info_stale_after = TimeDelta() # XXX
> - bounce_matching_headers = Unicode() # XXX
> - bounce_notify_owner_on_disable = Bool() # XXX
> - bounce_notify_owner_on_removal = Bool() # XXX
> - bounce_score_threshold = Int() # XXX
> - bounce_you_are_disabled_warnings = Int() # XXX
> - bounce_you_are_disabled_warnings_interval = TimeDelta() # XXX
> - forward_unrecognized_bounces_to = Enum(UnrecognizedBounceDisposition)
> - process_bounces = Bool()
> + bounce_info_stale_after = Column(Interval) # XXX
> + bounce_matching_headers = Column(Unicode) # XXX
> + bounce_notify_owner_on_disable = Column(Boolean) # XXX
> + bounce_notify_owner_on_removal = Column(Boolean) # XXX
> + bounce_score_threshold = Column(Integer) # XXX
> + bounce_you_are_disabled_warnings = Column(Integer) # XXX
> + bounce_you_are_disabled_warnings_interval = Column(Interval) # XXX
> + forward_unrecognized_bounces_to = Column(
> + Enum(UnrecognizedBounceDisposition))
> + process_bounces = Column(Boolean)
> # Miscellaneous
> - default_member_action = Enum(Action)
> - default_nonmember_action = Enum(Action)
> - description = Unicode()
> - digest_footer_uri = Unicode()
> - digest_header_uri = Unicode()
> - digest_is_default = Bool()
> - digest_send_periodic = Bool()
> - digest_size_threshold = Float()
> - digest_volume_frequency = Enum(DigestFrequency)
> - digestable = Bool()
> - discard_these_nonmembers = Pickle()
> - emergency = Bool()
> - encode_ascii_prefixes = Bool()
> - first_strip_reply_to = Bool()
> - footer_uri = Unicode()
> - forward_auto_discards = Bool()
> - gateway_to_mail = Bool()
> - gateway_to_news = Bool()
> - goodbye_message_uri = Unicode()
> - header_matches = Pickle()
> - header_uri = Unicode()
> - hold_these_nonmembers = Pickle()
> - info = Unicode()
> - linked_newsgroup = Unicode()
> - max_days_to_hold = Int()
> - max_message_size = Int()
> - max_num_recipients = Int()
> - member_moderation_notice = Unicode()
> - mime_is_default_digest = Bool()
> + default_member_action = Column(Enum(Action))
> + default_nonmember_action = Column(Enum(Action))
> + description = Column(Unicode)
> + digest_footer_uri = Column(Unicode)
> + digest_header_uri = Column(Unicode)
> + digest_is_default = Column(Boolean)
> + digest_send_periodic = Column(Boolean)
> + digest_size_threshold = Column(Float)
> + digest_volume_frequency = Column(Enum(DigestFrequency))
> + digestable = Column(Boolean)
> + discard_these_nonmembers = Column(PickleType)
> + emergency = Column(Boolean)
> + encode_ascii_prefixes = Column(Boolean)
> + first_strip_reply_to = Column(Boolean)
> + footer_uri = Column(Unicode)
> + forward_auto_discards = Column(Boolean)
> + gateway_to_mail = Column(Boolean)
> + gateway_to_news = Column(Boolean)
> + goodbye_message_uri = Column(Unicode)
> + header_matches = Column(PickleType)
> + header_uri = Column(Unicode)
> + hold_these_nonmembers = Column(PickleType)
> + info = Column(Unicode)
> + linked_newsgroup = Column(Unicode)
> + max_days_to_hold = Column(Integer)
> + max_message_size = Column(Integer)
> + max_num_recipients = Column(Integer)
> + member_moderation_notice = Column(Unicode)
> + mime_is_default_digest = Column(Boolean)
> # FIXME: There should be no moderator_password
> - moderator_password = RawStr()
> - newsgroup_moderation = Enum(NewsgroupModeration)
> - nntp_prefix_subject_too = Bool()
> - nondigestable = Bool()
> - nonmember_rejection_notice = Unicode()
> - obscure_addresses = Bool()
> - owner_chain = Unicode()
> - owner_pipeline = Unicode()
> - personalize = Enum(Personalization)
> - post_id = Int()
> - posting_chain = Unicode()
> - posting_pipeline = Unicode()
> - _preferred_language = Unicode(name='preferred_language')
> - display_name = Unicode()
> - reject_these_nonmembers = Pickle()
> - reply_goes_to_list = Enum(ReplyToMunging)
> - reply_to_address = Unicode()
> - require_explicit_destination = Bool()
> - respond_to_post_requests = Bool()
> - scrub_nondigest = Bool()
> - send_goodbye_message = Bool()
> - send_welcome_message = Bool()
> - subject_prefix = Unicode()
> - topics = Pickle()
> - topics_bodylines_limit = Int()
> - topics_enabled = Bool()
> - welcome_message_uri = Unicode()
> + moderator_password = Column(LargeBinary) # TODO : was RawStr()
> + newsgroup_moderation = Column(Enum(NewsgroupModeration))
> + nntp_prefix_subject_too = Column(Boolean)
> + nondigestable = Column(Boolean)
> + nonmember_rejection_notice = Column(Unicode)
> + obscure_addresses = Column(Boolean)
> + owner_chain = Column(Unicode)
> + owner_pipeline = Column(Unicode)
> + personalize = Column(Enum(Personalization))
> + post_id = Column(Integer)
> + posting_chain = Column(Unicode)
> + posting_pipeline = Column(Unicode)
> + _preferred_language = Column('preferred_language', Unicode)
> + display_name = Column(Unicode)
> + reject_these_nonmembers = Column(PickleType)
> + reply_goes_to_list = Column(Enum(ReplyToMunging))
> + reply_to_address = Column(Unicode)
> + require_explicit_destination = Column(Boolean)
> + respond_to_post_requests = Column(Boolean)
> + scrub_nondigest = Column(Boolean)
> + send_goodbye_message = Column(Boolean)
> + send_welcome_message = Column(Boolean)
> + subject_prefix = Column(Unicode)
> + topics = Column(PickleType)
> + topics_bodylines_limit = Column(Integer)
> + topics_enabled = Column(Boolean)
> + welcome_message_uri = Column(Unicode)
>
> def __init__(self, fqdn_listname):
> super(MailingList, self).__init__()
> @@ -198,14 +201,15 @@
> self._list_id = '{0}.{1}'.format(listname, hostname)
> # For the pending database
> self.next_request_id = 1
> - # We need to set up the rosters. Normally, this method will get
> - # called when the MailingList object is loaded from the database, but
> - # that's not the case when the constructor is called. So, set up the
> - # rosters explicitly.
> - self.__storm_loaded__()
> + # We need to set up the rosters. Normally, this method will get called
> + # when the MailingList object is loaded from the database, but when the
> + # constructor is called, SQLAlchemy's `load` event isn't triggered.
> + # Thus we need to set up the rosters explicitly.
> + self._post_load()
> makedirs(self.data_path)
>
> - def __storm_loaded__(self):
> + def _post_load(self, *args):
> + # This hooks up to SQLAlchemy's `load` event.
> self.owners = roster.OwnerRoster(self)
> self.moderators = roster.ModeratorRoster(self)
> self.administrators = roster.AdministratorRoster(self)
> @@ -215,6 +219,13 @@
> self.subscribers = roster.Subscribers(self)
> self.nonmembers = roster.NonmemberRoster(self)
>
> + @classmethod
> + def __declare_last__(cls):
> + # SQLAlchemy special directive hook called after mappings are assumed
> + # to be complete. Use this to connect the roster instance creation
> + # method with the SA `load` event.
> + listen(cls, 'load', cls._post_load)
> +
> def __repr__(self):
> return '<mailing list "{0}" at {1:#x}>'.format(
> self.fqdn_listname, id(self))
> @@ -323,42 +334,42 @@
> except AttributeError:
> self._preferred_language = language
>
> - def send_one_last_digest_to(self, address, delivery_mode):
> + @dbconnection
> + def send_one_last_digest_to(self, store, address, delivery_mode):
> """See `IMailingList`."""
> digest = OneLastDigest(self, address, delivery_mode)
> - Store.of(self).add(digest)
> + store.add(digest)
>
> @property
> - def last_digest_recipients(self):
> + @dbconnection
> + def last_digest_recipients(self, store):
> """See `IMailingList`."""
> - results = Store.of(self).find(
> - OneLastDigest,
> + results = store.query(OneLastDigest).filter(
> OneLastDigest.mailing_list == self)
> recipients = [(digest.address, digest.delivery_mode)
> for digest in results]
> - results.remove()
> + results.delete()
> return recipients
>
> @property
> - def filter_types(self):
> + @dbconnection
> + def filter_types(self, store):
> """See `IMailingList`."""
> - results = Store.of(self).find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.filter_mime))
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.filter_mime)
> for content_filter in results:
> yield content_filter.filter_pattern
>
> @filter_types.setter
> - def filter_types(self, sequence):
> + @dbconnection
> + def filter_types(self, store, sequence):
> """See `IMailingList`."""
> # First, delete all existing MIME type filter patterns.
> - store = Store.of(self)
> - results = store.find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.filter_mime))
> - results.remove()
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.filter_mime)
> + results.delete()
> # Now add all the new filter types.
> for mime_type in sequence:
> content_filter = ContentFilter(
> @@ -366,25 +377,24 @@
> store.add(content_filter)
>
> @property
> - def pass_types(self):
> + @dbconnection
> + def pass_types(self, store):
> """See `IMailingList`."""
> - results = Store.of(self).find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.pass_mime))
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.pass_mime)
> for content_filter in results:
> yield content_filter.filter_pattern
>
> @pass_types.setter
> - def pass_types(self, sequence):
> + @dbconnection
> + def pass_types(self, store, sequence):
> """See `IMailingList`."""
> # First, delete all existing MIME type pass patterns.
> - store = Store.of(self)
> - results = store.find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.pass_mime))
> - results.remove()
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.pass_mime)
> + results.delete()
> # Now add all the new filter types.
> for mime_type in sequence:
> content_filter = ContentFilter(
> @@ -392,25 +402,24 @@
> store.add(content_filter)
>
> @property
> - def filter_extensions(self):
> + @dbconnection
> + def filter_extensions(self, store):
> """See `IMailingList`."""
> - results = Store.of(self).find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.filter_extension))
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.filter_extension)
> for content_filter in results:
> yield content_filter.filter_pattern
>
> @filter_extensions.setter
> - def filter_extensions(self, sequence):
> + @dbconnection
> + def filter_extensions(self, store, sequence):
> """See `IMailingList`."""
> # First, delete all existing file extensions filter patterns.
> - store = Store.of(self)
> - results = store.find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.filter_extension))
> - results.remove()
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.filter_extension)
> + results.delete()
> # Now add all the new filter types.
> for mime_type in sequence:
> content_filter = ContentFilter(
> @@ -418,25 +427,24 @@
> store.add(content_filter)
>
> @property
> - def pass_extensions(self):
> + @dbconnection
> + def pass_extensions(self, store):
> """See `IMailingList`."""
> - results = Store.of(self).find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.pass_extension))
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.pass_extension)
> for content_filter in results:
> yield content_filter.pass_pattern
>
> @pass_extensions.setter
> - def pass_extensions(self, sequence):
> + @dbconnection
> + def pass_extensions(self, store, sequence):
> """See `IMailingList`."""
> # First, delete all existing file extensions pass patterns.
> - store = Store.of(self)
> - results = store.find(
> - ContentFilter,
> - And(ContentFilter.mailing_list == self,
> - ContentFilter.filter_type == FilterType.pass_extension))
> - results.remove()
> + results = store.query(ContentFilter).filter(
> + ContentFilter.mailing_list == self,
> + ContentFilter.filter_type == FilterType.pass_extension)
> + results.delete()
> # Now add all the new filter types.
> for mime_type in sequence:
> content_filter = ContentFilter(
> @@ -455,26 +463,24 @@
> raise TypeError(
> 'Undefined MemberRole: {0}'.format(role))
>
> - def subscribe(self, subscriber, role=MemberRole.member):
> + @dbconnection
> + def subscribe(self, store, subscriber, role=MemberRole.member):
> """See `IMailingList`."""
> - store = Store.of(self)
> if IAddress.providedBy(subscriber):
> - member = store.find(
> - Member,
> + member = store.query(Member).filter(
> Member.role == role,
> Member.list_id == self._list_id,
> - Member._address == subscriber).one()
> + Member._address == subscriber).first()
> if member:
> raise AlreadySubscribedError(
> self.fqdn_listname, subscriber.email, role)
> elif IUser.providedBy(subscriber):
> if subscriber.preferred_address is None:
> raise MissingPreferredAddressError(subscriber)
> - member = store.find(
> - Member,
> + member = store.query(Member).filter(
> Member.role == role,
> Member.list_id == self._list_id,
> - Member._user == subscriber).one()
> + Member._user == subscriber).first()
> if member:
> raise AlreadySubscribedError(
> self.fqdn_listname, subscriber, role)
> @@ -494,12 +500,15 @@
> class AcceptableAlias(Model):
> """See `IAcceptableAlias`."""
>
> - id = Int(primary=True)
> -
> - mailing_list_id = Int()
> - mailing_list = Reference(mailing_list_id, MailingList.id)
> -
> - alias = Unicode()
> + __tablename__ = 'acceptablealias'
> +
> + id = Column(Integer, primary_key=True)
> +
> + mailing_list_id = Column(
> + Integer, ForeignKey('mailinglist.id'),
> + index=True, nullable=False)
> + mailing_list = relationship('MailingList', backref='acceptable_alias')
> + alias = Column(Unicode, index=True, nullable=False)
>
> def __init__(self, mailing_list, alias):
> self.mailing_list = mailing_list
> @@ -514,29 +523,30 @@
> def __init__(self, mailing_list):
> self._mailing_list = mailing_list
>
> - def clear(self):
> + @dbconnection
> + def clear(self, store):
> """See `IAcceptableAliasSet`."""
> - Store.of(self._mailing_list).find(
> - AcceptableAlias,
> - AcceptableAlias.mailing_list == self._mailing_list).remove()
> + store.query(AcceptableAlias).filter(
> + AcceptableAlias.mailing_list == self._mailing_list).delete()
>
> - def add(self, alias):
> + @dbconnection
> + def add(self, store, alias):
> if not (alias.startswith('^') or '@' in alias):
> raise ValueError(alias)
> alias = AcceptableAlias(self._mailing_list, alias.lower())
> - Store.of(self._mailing_list).add(alias)
> + store.add(alias)
>
> - def remove(self, alias):
> - Store.of(self._mailing_list).find(
> - AcceptableAlias,
> - And(AcceptableAlias.mailing_list == self._mailing_list,
> - AcceptableAlias.alias == alias.lower())).remove()
> + @dbconnection
> + def remove(self, store, alias):
> + store.query(AcceptableAlias).filter(
> + AcceptableAlias.mailing_list == self._mailing_list,
> + AcceptableAlias.alias == alias.lower()).delete()
>
> @property
> - def aliases(self):
> - aliases = Store.of(self._mailing_list).find(
> - AcceptableAlias,
> - AcceptableAlias.mailing_list == self._mailing_list)
> + @dbconnection
> + def aliases(self, store):
> + aliases = store.query(AcceptableAlias).filter(
> + AcceptableAlias.mailing_list_id == self._mailing_list.id)
> for alias in aliases:
> yield alias.alias
>
> @@ -546,12 +556,17 @@
> class ListArchiver(Model):
> """See `IListArchiver`."""
>
> - id = Int(primary=True)
> -
> - mailing_list_id = Int()
> - mailing_list = Reference(mailing_list_id, MailingList.id)
> - name = Unicode()
> - _is_enabled = Bool()
> + __tablename__ = 'listarchiver'
> +
> + id = Column(Integer, primary_key=True)
> +
> + mailing_list_id = Column(
> + Integer, ForeignKey('mailinglist.id'),
> + index=True, nullable=False)
> + mailing_list = relationship('MailingList')
> +
> + name = Column(Unicode, nullable=False)
> + _is_enabled = Column(Boolean)
>
> def __init__(self, mailing_list, archiver_name, system_archiver):
> self.mailing_list = mailing_list
> @@ -576,32 +591,32 @@
>
> @implementer(IListArchiverSet)
> class ListArchiverSet:
> - def __init__(self, mailing_list):
> + @dbconnection
> + def __init__(self, store, mailing_list):
> self._mailing_list = mailing_list
> system_archivers = {}
> for archiver in config.archivers:
> system_archivers[archiver.name] = archiver
> # Add any system enabled archivers which aren't already associated
> # with the mailing list.
> - store = Store.of(self._mailing_list)
> for archiver_name in system_archivers:
> - exists = store.find(
> - ListArchiver,
> - And(ListArchiver.mailing_list == mailing_list,
> - ListArchiver.name == archiver_name)).one()
> + exists = store.query(ListArchiver).filter(
> + ListArchiver.mailing_list == mailing_list,
> + ListArchiver.name == archiver_name).first()
> if exists is None:
> store.add(ListArchiver(mailing_list, archiver_name,
> system_archivers[archiver_name]))
>
> @property
> - def archivers(self):
> - entries = Store.of(self._mailing_list).find(
> - ListArchiver, ListArchiver.mailing_list == self._mailing_list)
> + @dbconnection
> + def archivers(self, store):
> + entries = store.query(ListArchiver).filter(
> + ListArchiver.mailing_list == self._mailing_list)
> for entry in entries:
> yield entry
>
> - def get(self, archiver_name):
> - return Store.of(self._mailing_list).find(
> - ListArchiver,
> - And(ListArchiver.mailing_list == self._mailing_list,
> - ListArchiver.name == archiver_name)).one()
> + @dbconnection
> + def get(self, store, archiver_name):
> + return store.query(ListArchiver).filter(
> + ListArchiver.mailing_list == self._mailing_list,
> + ListArchiver.name == archiver_name).first()
>
> === modified file 'src/mailman/model/member.py'
> --- src/mailman/model/member.py 2014-03-02 21:38:32 +0000
> +++ src/mailman/model/member.py 2014-10-14 01:13:52 +0000
> @@ -24,8 +24,8 @@
> 'Member',
> ]
>
> -from storm.locals import Int, Reference, Unicode
> -from storm.properties import UUID
> +from sqlalchemy import Column, ForeignKey, Integer, Unicode
> +from sqlalchemy.orm import relationship
> from zope.component import getUtility
> from zope.event import notify
> from zope.interface import implementer
> @@ -33,7 +33,7 @@
> from mailman.core.constants import system_preferences
> from mailman.database.model import Model
> from mailman.database.transaction import dbconnection
> -from mailman.database.types import Enum
> +from mailman.database.types import Enum, UUID
> from mailman.interfaces.action import Action
> from mailman.interfaces.address import IAddress
> from mailman.interfaces.listmanager import IListManager
> @@ -52,18 +52,20 @@
> class Member(Model):
> """See `IMember`."""
>
> - id = Int(primary=True)
> - _member_id = UUID()
> - role = Enum(MemberRole)
> - list_id = Unicode()
> - moderation_action = Enum(Action)
> -
> - address_id = Int()
> - _address = Reference(address_id, 'Address.id')
> - preferences_id = Int()
> - preferences = Reference(preferences_id, 'Preferences.id')
> - user_id = Int()
> - _user = Reference(user_id, 'User.id')
> + __tablename__ = 'member'
> +
> + id = Column(Integer, primary_key=True)
> + _member_id = Column(UUID)
> + role = Column(Enum(MemberRole))
> + list_id = Column(Unicode)
> + moderation_action = Column(Enum(Action))
> +
> + address_id = Column(Integer, ForeignKey('address.id'))
> + _address = relationship('Address')
> + preferences_id = Column(Integer, ForeignKey('preferences.id'))
> + preferences = relationship('Preferences')
> + user_id = Column(Integer, ForeignKey('user.id'))
> + _user = relationship('User')
>
> def __init__(self, role, list_id, subscriber):
> self._member_id = uid_factory.new_uid()
> @@ -198,5 +200,5 @@
> """See `IMember`."""
> # Yes, this must get triggered before self is deleted.
> notify(UnsubscriptionEvent(self.mailing_list, self))
> - store.remove(self.preferences)
> - store.remove(self)
> + store.delete(self.preferences)
> + store.delete(self)
>
> === modified file 'src/mailman/model/message.py'
> --- src/mailman/model/message.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/message.py 2014-10-14 01:13:52 +0000
> @@ -24,7 +24,7 @@
> 'Message',
> ]
>
> -from storm.locals import AutoReload, Int, RawStr, Unicode
> +from sqlalchemy import Column, Integer, LargeBinary, Unicode
> from zope.interface import implementer
>
> from mailman.database.model import Model
> @@ -37,11 +37,13 @@
> class Message(Model):
> """A message in the message store."""
>
> - id = Int(primary=True, default=AutoReload)
> - message_id = Unicode()
> - message_id_hash = RawStr()
> - path = RawStr()
> + __tablename__ = 'message'
> +
> + id = Column(Integer, primary_key=True)
> # This is a Message-ID field representation, not a database row id.
> + message_id = Column(Unicode)
> + message_id_hash = Column(LargeBinary)
> + path = Column(LargeBinary)
>
> @dbconnection
> def __init__(self, store, message_id, message_id_hash, path):
>
> === modified file 'src/mailman/model/messagestore.py'
> --- src/mailman/model/messagestore.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/messagestore.py 2014-10-14 01:13:52 +0000
> @@ -54,12 +54,13 @@
> def add(self, store, message):
> # Ensure that the message has the requisite headers.
> message_ids = message.get_all('message-id', [])
> - if len(message_ids) <> 1:
> + if len(message_ids) != 1:
> raise ValueError('Exactly one Message-ID header required')
> # Calculate and insert the X-Message-ID-Hash.
> message_id = message_ids[0]
> # Complain if the Message-ID already exists in the storage.
> - existing = store.find(Message, Message.message_id == message_id).one()
> + existing = store.query(Message).filter(
> + Message.message_id == message_id).first()
> if existing is not None:
> raise ValueError(
> 'Message ID already exists in message store: {0}'.format(
> @@ -80,9 +81,9 @@
> # providing a unique serial number, but to get this information, we
> # have to use a straight insert instead of relying on Elixir to create
> # the object.
> - row = Message(message_id=message_id,
> - message_id_hash=hash32,
> - path=relpath)
> + Message(message_id=message_id,
> + message_id_hash=hash32,
> + path=relpath)
> # Now calculate the full file system path.
> path = os.path.join(config.MESSAGES_DIR, relpath)
> # Write the file to the path, but catch the appropriate exception in
> @@ -95,7 +96,7 @@
> pickle.dump(message, fp, -1)
> break
> except IOError as error:
> - if error.errno <> errno.ENOENT:
> + if error.errno != errno.ENOENT:
> raise
> makedirs(os.path.dirname(path))
> return hash32
> @@ -107,7 +108,7 @@
>
> @dbconnection
> def get_message_by_id(self, store, message_id):
> - row = store.find(Message, message_id=message_id).one()
> + row = store.query(Message).filter_by(message_id=message_id).first()
> if row is None:
> return None
> return self._get_message(row)
> @@ -116,11 +117,11 @@
> def get_message_by_hash(self, store, message_id_hash):
> # It's possible the hash came from a message header, in which case it
> # will be a Unicode. However when coming from source code, it may be
> - # an 8-string. Coerce to the latter if necessary; it must be
> - # US-ASCII.
> - if isinstance(message_id_hash, unicode):
> + # a bytes object. Coerce to the latter if necessary; it must be ASCII.
> + if not isinstance(message_id_hash, bytes):
> message_id_hash = message_id_hash.encode('ascii')
> - row = store.find(Message, message_id_hash=message_id_hash).one()
> + row = store.query(Message).filter_by(
> + message_id_hash=message_id_hash).first()
> if row is None:
> return None
> return self._get_message(row)
> @@ -128,14 +129,14 @@
> @property
> @dbconnection
> def messages(self, store):
> - for row in store.find(Message):
> + for row in store.query(Message).all():
> yield self._get_message(row)
>
> @dbconnection
> def delete_message(self, store, message_id):
> - row = store.find(Message, message_id=message_id).one()
> + row = store.query(Message).filter_by(message_id=message_id).first()
> if row is None:
> raise LookupError(message_id)
> path = os.path.join(config.MESSAGES_DIR, row.path)
> os.remove(path)
> - store.remove(row)
> + store.delete(row)
>
> === modified file 'src/mailman/model/mime.py'
> --- src/mailman/model/mime.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/mime.py 2014-10-14 01:13:52 +0000
> @@ -25,7 +25,8 @@
> ]
>
>
> -from storm.locals import Int, Reference, Unicode
> +from sqlalchemy import Column, ForeignKey, Integer, Unicode
> +from sqlalchemy.orm import relationship
> from zope.interface import implementer
>
> from mailman.database.model import Model
> @@ -38,13 +39,15 @@
> class ContentFilter(Model):
> """A single filter criteria."""
>
> - id = Int(primary=True)
> -
> - mailing_list_id = Int()
> - mailing_list = Reference(mailing_list_id, 'MailingList.id')
> -
> - filter_type = Enum(FilterType)
> - filter_pattern = Unicode()
> + __tablename__ = 'contentfilter'
> +
> + id = Column(Integer, primary_key=True)
> +
> + mailing_list_id = Column(Integer, ForeignKey('mailinglist.id'), index=True)
> + mailing_list = relationship('MailingList')
> +
> + filter_type = Column(Enum(FilterType))
> + filter_pattern = Column(Unicode)
>
> def __init__(self, mailing_list, filter_pattern, filter_type):
> self.mailing_list = mailing_list
>
> === modified file 'src/mailman/model/pending.py'
> --- src/mailman/model/pending.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/pending.py 2014-10-14 01:13:52 +0000
> @@ -31,7 +31,9 @@
> import hashlib
>
> from lazr.config import as_timedelta
> -from storm.locals import DateTime, Int, RawStr, ReferenceSet, Unicode
> +from sqlalchemy import (
> + Column, DateTime, ForeignKey, Integer, LargeBinary, Unicode)
> +from sqlalchemy.orm import relationship
> from zope.interface import implementer
> from zope.interface.verify import verifyObject
>
> @@ -49,31 +51,35 @@
> class PendedKeyValue(Model):
> """A pended key/value pair, tied to a token."""
>
> + __tablename__ = 'pendedkeyvalue'
> +
> + id = Column(Integer, primary_key=True)
> + key = Column(Unicode)
> + value = Column(Unicode)
> + pended_id = Column(Integer, ForeignKey('pended.id'), index=True)
> +
> def __init__(self, key, value):
> self.key = key
> self.value = value
>
> - id = Int(primary=True)
> - key = Unicode()
> - value = Unicode()
> - pended_id = Int()
> -
>
>
> @implementer(IPended)
> class Pended(Model):
> """A pended event, tied to a token."""
>
> + __tablename__ = 'pended'
> +
> + id = Column(Integer, primary_key=True)
> + token = Column(LargeBinary)
> + expiration_date = Column(DateTime)
> + key_values = relationship('PendedKeyValue')
> +
> def __init__(self, token, expiration_date):
> super(Pended, self).__init__()
> self.token = token
> self.expiration_date = expiration_date
>
> - id = Int(primary=True)
> - token = RawStr()
> - expiration_date = DateTime()
> - key_values = ReferenceSet(id, PendedKeyValue.pended_id)
> -
>
>
> @implementer(IPendable)
> @@ -105,7 +111,7 @@
> token = hashlib.sha1(repr(x)).hexdigest()
> # In practice, we'll never get a duplicate, but we'll be anal
> # about checking anyway.
> - if store.find(Pended, token=token).count() == 0:
> + if store.query(Pended).filter_by(token=token).count() == 0:
> break
> else:
> raise AssertionError('Could not find a valid pendings token')
> @@ -114,10 +120,10 @@
> token=token,
> expiration_date=now() + lifetime)
> for key, value in pendable.items():
> - if isinstance(key, str):
> - key = unicode(key, 'utf-8')
> - if isinstance(value, str):
> - value = unicode(value, 'utf-8')
> + if isinstance(key, bytes):
> + key = key.decode('utf-8')
> + if isinstance(value, bytes):
> + value = value.decode('utf-8')
> elif type(value) is int:
> value = '__builtin__.int\1%s' % value
> elif type(value) is float:
> @@ -129,7 +135,7 @@
> value = ('mailman.model.pending.unpack_list\1' +
> '\2'.join(value))
> keyval = PendedKeyValue(key=key, value=value)
> - pending.key_values.add(keyval)
> + pending.key_values.append(keyval)
> store.add(pending)
> return token
>
> @@ -137,7 +143,7 @@
> def confirm(self, store, token, expunge=True):
> # Token can come in as a unicode, but it's stored in the database as
> # bytes. They must be ascii.
> - pendings = store.find(Pended, token=str(token))
> + pendings = store.query(Pended).filter_by(token=str(token))
> if pendings.count() == 0:
> return None
> assert pendings.count() == 1, (
> @@ -146,31 +152,32 @@
> pendable = UnpendedPendable()
> # Find all PendedKeyValue entries that are associated with the pending
> # object's ID. Watch out for type conversions.
> - for keyvalue in store.find(PendedKeyValue,
> - PendedKeyValue.pended_id == pending.id):
> + entries = store.query(PendedKeyValue).filter(
> + PendedKeyValue.pended_id == pending.id)
> + for keyvalue in entries:
> if keyvalue.value is not None and '\1' in keyvalue.value:
> type_name, value = keyvalue.value.split('\1', 1)
> pendable[keyvalue.key] = call_name(type_name, value)
> else:
> pendable[keyvalue.key] = keyvalue.value
> if expunge:
> - store.remove(keyvalue)
> + store.delete(keyvalue)
> if expunge:
> - store.remove(pending)
> + store.delete(pending)
> return pendable
>
> @dbconnection
> def evict(self, store):
> right_now = now()
> - for pending in store.find(Pended):
> + for pending in store.query(Pended).all():
> if pending.expiration_date < right_now:
> # Find all PendedKeyValue entries that are associated with the
> # pending object's ID.
> - q = store.find(PendedKeyValue,
> - PendedKeyValue.pended_id == pending.id)
> + q = store.query(PendedKeyValue).filter(
> + PendedKeyValue.pended_id == pending.id)
> for keyvalue in q:
> - store.remove(keyvalue)
> - store.remove(pending)
> + store.delete(keyvalue)
> + store.delete(pending)
>
>
>
>
> === modified file 'src/mailman/model/preferences.py'
> --- src/mailman/model/preferences.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/preferences.py 2014-10-14 01:13:52 +0000
> @@ -25,7 +25,7 @@
> ]
>
>
> -from storm.locals import Bool, Int, Unicode
> +from sqlalchemy import Boolean, Column, Integer, Unicode
> from zope.component import getUtility
> from zope.interface import implementer
>
> @@ -41,14 +41,16 @@
> class Preferences(Model):
> """See `IPreferences`."""
>
> - id = Int(primary=True)
> - acknowledge_posts = Bool()
> - hide_address = Bool()
> - _preferred_language = Unicode(name='preferred_language')
> - receive_list_copy = Bool()
> - receive_own_postings = Bool()
> - delivery_mode = Enum(DeliveryMode)
> - delivery_status = Enum(DeliveryStatus)
> + __tablename__ = 'preferences'
> +
> + id = Column(Integer, primary_key=True)
> + acknowledge_posts = Column(Boolean)
> + hide_address = Column(Boolean)
> + _preferred_language = Column('preferred_language', Unicode)
> + receive_list_copy = Column(Boolean)
> + receive_own_postings = Column(Boolean)
> + delivery_mode = Column(Enum(DeliveryMode))
> + delivery_status = Column(Enum(DeliveryStatus))
>
> def __repr__(self):
> return '<Preferences object at {0:#x}>'.format(id(self))
>
> === modified file 'src/mailman/model/requests.py'
> --- src/mailman/model/requests.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/requests.py 2014-10-14 01:13:52 +0000
> @@ -26,7 +26,8 @@
>
> from cPickle import dumps, loads
> from datetime import timedelta
> -from storm.locals import AutoReload, Int, RawStr, Reference, Unicode
> +from sqlalchemy import Column, ForeignKey, Integer, LargeBinary, Unicode
> +from sqlalchemy.orm import relationship
> from zope.component import getUtility
> from zope.interface import implementer
>
> @@ -68,25 +69,25 @@
> @property
> @dbconnection
> def count(self, store):
> - return store.find(_Request, mailing_list=self.mailing_list).count()
> + return store.query(_Request).filter_by(
> + mailing_list=self.mailing_list).count()
>
> @dbconnection
> def count_of(self, store, request_type):
> - return store.find(
> - _Request,
> + return store.query(_Request).filter_by(
> mailing_list=self.mailing_list, request_type=request_type).count()
>
> @property
> @dbconnection
> def held_requests(self, store):
> - results = store.find(_Request, mailing_list=self.mailing_list)
> + results = store.query(_Request).filter_by(
> + mailing_list=self.mailing_list)
> for request in results:
> yield request
>
> @dbconnection
> def of_type(self, store, request_type):
> - results = store.find(
> - _Request,
> + results = store.query(_Request).filter_by(
> mailing_list=self.mailing_list, request_type=request_type)
> for request in results:
> yield request
> @@ -104,11 +105,15 @@
> data_hash = token
> request = _Request(key, request_type, self.mailing_list, data_hash)
> store.add(request)
> + # XXX The caller needs a valid id immediately, so flush the changes
> + # now to the SA transaction context. Otherwise .id would not be
> + # valid. Hopefully this has no unintended side-effects.
> + store.flush()
> return request.id
>
> @dbconnection
> def get_request(self, store, request_id, request_type=None):
> - result = store.get(_Request, request_id)
> + result = store.query(_Request).get(request_id)
> if result is None:
> return None
> if request_type is not None and result.request_type != request_type:
> @@ -117,6 +122,8 @@
> return result.key, None
> pendable = getUtility(IPendings).confirm(
> result.data_hash, expunge=False)
> + if pendable is None:
> + return None
> data = dict()
> # Unpickle any non-Unicode values.
> for key, value in pendable.items():
> @@ -130,25 +137,27 @@
>
> @dbconnection
> def delete_request(self, store, request_id):
> - request = store.get(_Request, request_id)
> + request = store.query(_Request).get(request_id)
> if request is None:
> raise KeyError(request_id)
> # Throw away the pended data.
> getUtility(IPendings).confirm(request.data_hash)
> - store.remove(request)
> + store.delete(request)
>
>
>
> class _Request(Model):
> """Table for mailing list hold requests."""
>
> - id = Int(primary=True, default=AutoReload)
> - key = Unicode()
> - request_type = Enum(RequestType)
> - data_hash = RawStr()
> -
> - mailing_list_id = Int()
> - mailing_list = Reference(mailing_list_id, 'MailingList.id')
> + __tablename__ = '_request'
> +
> + id = Column(Integer, primary_key=True)
> + key = Column(Unicode)
> + request_type = Column(Enum(RequestType))
> + data_hash = Column(LargeBinary)
> +
> + mailing_list_id = Column(Integer, ForeignKey('mailinglist.id'), index=True)
> + mailing_list = relationship('MailingList')
>
> def __init__(self, key, request_type, mailing_list, data_hash):
> super(_Request, self).__init__()
>
> === modified file 'src/mailman/model/roster.py'
> --- src/mailman/model/roster.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/roster.py 2014-10-14 01:13:52 +0000
> @@ -37,7 +37,7 @@
> ]
>
>
> -from storm.expr import And, Or
> +from sqlalchemy import and_, or_
> from zope.interface import implementer
>
> from mailman.database.transaction import dbconnection
> @@ -65,8 +65,7 @@
>
> @dbconnection
> def _query(self, store):
> - return store.find(
> - Member,
> + return store.query(Member).filter(
> Member.list_id == self._mlist.list_id,
> Member.role == self.role)
>
> @@ -104,8 +103,7 @@
> @dbconnection
> def get_member(self, store, address):
> """See `IRoster`."""
> - results = store.find(
> - Member,
> + results = store.query(Member).filter(
> Member.list_id == self._mlist.list_id,
> Member.role == self.role,
> Address.email == address,
> @@ -160,20 +158,18 @@
>
> @dbconnection
> def _query(self, store):
> - return store.find(
> - Member,
> + return store.query(Member).filter(
> Member.list_id == self._mlist.list_id,
> - Or(Member.role == MemberRole.owner,
> - Member.role == MemberRole.moderator))
> + or_(Member.role == MemberRole.owner,
> + Member.role == MemberRole.moderator))
>
> @dbconnection
> def get_member(self, store, address):
> """See `IRoster`."""
> - results = store.find(
> - Member,
> + results = store.query(Member).filter(
> Member.list_id == self._mlist.list_id,
> - Or(Member.role == MemberRole.moderator,
> - Member.role == MemberRole.owner),
> + or_(Member.role == MemberRole.moderator,
> + Member.role == MemberRole.owner),
> Address.email == address,
> Member.address_id == Address.id)
> if results.count() == 0:
> @@ -206,10 +202,9 @@
> :return: A generator of members.
> :rtype: generator
> """
> - results = store.find(
> - Member,
> - And(Member.list_id == self._mlist.list_id,
> - Member.role == MemberRole.member))
> + results = store.query(Member).filter_by(
> + list_id = self._mlist.list_id,
> + role = MemberRole.member)
> for member in results:
> if member.delivery_mode in delivery_modes:
> yield member
> @@ -250,7 +245,7 @@
>
> @dbconnection
> def _query(self, store):
> - return store.find(Member, Member.list_id == self._mlist.list_id)
> + return store.query(Member).filter_by(list_id = self._mlist.list_id)
>
>
>
> @@ -265,12 +260,11 @@
>
> @dbconnection
> def _query(self, store):
> - results = store.find(
> - Member,
> - Or(Member.user_id == self._user.id,
> - And(Address.user_id == self._user.id,
> - Member.address_id == Address.id)))
> - return results.config(distinct=True)
> + results = store.query(Member).filter(
> + or_(Member.user_id == self._user.id,
> + and_(Address.user_id == self._user.id,
> + Member.address_id == Address.id)))
> + return results.distinct()
>
> @property
> def member_count(self):
> @@ -297,8 +291,7 @@
> @dbconnection
> def get_member(self, store, address):
> """See `IRoster`."""
> - results = store.find(
> - Member,
> + results = store.query(Member).filter(
> Member.address_id == Address.id,
> Address.user_id == self._user.id)
> if results.count() == 0:
>
> === modified file 'src/mailman/model/tests/test_listmanager.py'
> --- src/mailman/model/tests/test_listmanager.py 2014-04-14 16:14:13 +0000
> +++ src/mailman/model/tests/test_listmanager.py 2014-10-14 01:13:52 +0000
> @@ -29,11 +29,11 @@
>
> import unittest
>
> -from storm.locals import Store
> from zope.component import getUtility
>
> from mailman.app.lifecycle import create_list
> from mailman.app.moderator import hold_message
> +from mailman.config import config
> from mailman.interfaces.listmanager import (
> IListManager, ListCreatedEvent, ListCreatingEvent, ListDeletedEvent,
> ListDeletingEvent)
> @@ -80,6 +80,15 @@
> self.assertTrue(isinstance(self._events[1], ListDeletedEvent))
> self.assertEqual(self._events[1].fqdn_listname, 'another(a)example.com')
>
> + def test_list_manager_list_ids(self):
> + # You can get all the list ids for all the existing mailing lists.
> + create_list('ant(a)example.com')
> + create_list('bee(a)example.com')
> + create_list('cat(a)example.com')
> + self.assertEqual(
> + sorted(getUtility(IListManager).list_ids),
> + ['ant.example.com', 'bee.example.com', 'cat.example.com'])
> +
>
>
> class TestListLifecycleEvents(unittest.TestCase):
> @@ -139,9 +148,8 @@
> for name in filter_names:
> setattr(self._ant, name, ['test-filter-1', 'test-filter-2'])
> getUtility(IListManager).delete(self._ant)
> - store = Store.of(self._ant)
> - filters = store.find(ContentFilter,
> - ContentFilter.mailing_list == self._ant)
> + filters = config.db.store.query(ContentFilter).filter_by(
> + mailing_list = self._ant)
> self.assertEqual(filters.count(), 0)
>
>
>
> === modified file 'src/mailman/model/tests/test_requests.py'
> --- src/mailman/model/tests/test_requests.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/tests/test_requests.py 2014-10-14 01:13:52 +0000
> @@ -70,10 +70,10 @@
> # Calling hold_request() with a bogus request type is an error.
> with self.assertRaises(TypeError) as cm:
> self._requests_db.hold_request(5, 'foo')
> - self.assertEqual(cm.exception.message, 5)
> + self.assertEqual(cm.exception.args[0], 5)
>
> def test_delete_missing_request(self):
> # Trying to delete a missing request is an error.
> with self.assertRaises(KeyError) as cm:
> self._requests_db.delete_request(801)
> - self.assertEqual(cm.exception.message, 801)
> + self.assertEqual(cm.exception.args[0], 801)
>
> === modified file 'src/mailman/model/uid.py'
> --- src/mailman/model/uid.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/uid.py 2014-10-14 01:13:52 +0000
> @@ -25,11 +25,12 @@
> ]
>
>
> -from storm.locals import Int
> -from storm.properties import UUID
> +
> +from sqlalchemy import Column, Integer
>
> from mailman.database.model import Model
> from mailman.database.transaction import dbconnection
> +from mailman.database.types import UUID
>
>
>
> @@ -45,8 +46,11 @@
> There is no interface for this class, because it's purely an internal
> implementation detail.
> """
> - id = Int(primary=True)
> - uid = UUID()
> +
> + __tablename__ = 'uid'
> +
> + id = Column(Integer, primary_key=True)
> + uid = Column(UUID, index=True)
>
> @dbconnection
> def __init__(self, store, uid):
> @@ -70,7 +74,7 @@
> :type uid: unicode
> :raises ValueError: if the id is not unique.
> """
> - existing = store.find(UID, uid=uid)
> + existing = store.query(UID).filter_by(uid=uid)
> if existing.count() != 0:
> raise ValueError(uid)
> return UID(uid)
>
> === modified file 'src/mailman/model/user.py'
> --- src/mailman/model/user.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/user.py 2014-10-14 01:13:52 +0000
> @@ -24,14 +24,15 @@
> 'User',
> ]
>
> -from storm.locals import (
> - DateTime, Int, RawStr, Reference, ReferenceSet, Unicode)
> -from storm.properties import UUID
> +from sqlalchemy import (
> + Column, DateTime, ForeignKey, Integer, LargeBinary, Unicode)
> +from sqlalchemy.orm import relationship, backref
> from zope.event import notify
> from zope.interface import implementer
>
> from mailman.database.model import Model
> from mailman.database.transaction import dbconnection
> +from mailman.database.types import UUID
> from mailman.interfaces.address import (
> AddressAlreadyLinkedError, AddressNotLinkedError)
> from mailman.interfaces.user import (
> @@ -51,24 +52,38 @@
> class User(Model):
> """Mailman users."""
>
> - id = Int(primary=True)
> - display_name = Unicode()
> - _password = RawStr(name='password')
> - _user_id = UUID()
> - _created_on = DateTime()
> -
> - addresses = ReferenceSet(id, 'Address.user_id')
> - _preferred_address_id = Int()
> - _preferred_address = Reference(_preferred_address_id, 'Address.id')
> - preferences_id = Int()
> - preferences = Reference(preferences_id, 'Preferences.id')
> + __tablename__ = 'user'
> +
> + id = Column(Integer, primary_key=True)
> + display_name = Column(Unicode)
> + _password = Column('password', LargeBinary)
> + _user_id = Column(UUID, index=True)
> + _created_on = Column(DateTime)
> +
> + addresses = relationship(
> + 'Address', backref='user',
> + primaryjoin=(id==Address.user_id))
> +
> + _preferred_address_id = Column(
> + Integer,
> + ForeignKey('address.id', use_alter=True,
> + name='_preferred_address',
> + ondelete='SET NULL'))
> +
> + _preferred_address = relationship(
> + 'Address', primaryjoin=(_preferred_address_id==Address.id),
> + post_update=True)
> +
> + preferences_id = Column(Integer, ForeignKey('preferences.id'), index=True)
> + preferences = relationship(
> + 'Preferences', backref=backref('user', uselist=False))
>
> @dbconnection
> def __init__(self, store, display_name=None, preferences=None):
> super(User, self).__init__()
> self._created_on = date_factory.now()
> user_id = uid_factory.new_uid()
> - assert store.find(User, _user_id=user_id).count() == 0, (
> + assert store.query(User).filter_by(_user_id=user_id).count() == 0, (
> 'Duplicate user id {0}'.format(user_id))
> self._user_id = user_id
> self.display_name = ('' if display_name is None else display_name)
> @@ -138,7 +153,7 @@
> @dbconnection
> def controls(self, store, email):
> """See `IUser`."""
> - found = store.find(Address, email=email)
> + found = store.query(Address).filter_by(email=email)
> if found.count() == 0:
> return False
> assert found.count() == 1, 'Unexpected count'
> @@ -148,7 +163,7 @@
> def register(self, store, email, display_name=None):
> """See `IUser`."""
> # First, see if the address already exists
> - address = store.find(Address, email=email).one()
> + address = store.query(Address).filter_by(email=email).first()
> if address is None:
> if display_name is None:
> display_name = ''
>
> === modified file 'src/mailman/model/usermanager.py'
> --- src/mailman/model/usermanager.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/usermanager.py 2014-10-14 01:13:52 +0000
> @@ -52,12 +52,12 @@
> @dbconnection
> def delete_user(self, store, user):
> """See `IUserManager`."""
> - store.remove(user)
> + store.delete(user)
>
> @dbconnection
> def get_user(self, store, email):
> """See `IUserManager`."""
> - addresses = store.find(Address, email=email.lower())
> + addresses = store.query(Address).filter_by(email=email.lower())
> if addresses.count() == 0:
> return None
> return addresses.one().user
> @@ -65,7 +65,7 @@
> @dbconnection
> def get_user_by_id(self, store, user_id):
> """See `IUserManager`."""
> - users = store.find(User, _user_id=user_id)
> + users = store.query(User).filter_by(_user_id=user_id)
> if users.count() == 0:
> return None
> return users.one()
> @@ -74,13 +74,13 @@
> @dbconnection
> def users(self, store):
> """See `IUserManager`."""
> - for user in store.find(User):
> + for user in store.query(User).all():
> yield user
>
> @dbconnection
> def create_address(self, store, email, display_name=None):
> """See `IUserManager`."""
> - addresses = store.find(Address, email=email.lower())
> + addresses = store.query(Address).filter(Address.email==email.lower())
> if addresses.count() == 1:
> found = addresses[0]
> raise ExistingAddressError(found.original_email)
> @@ -101,12 +101,12 @@
> # unlinked before the address can be deleted.
> if address.user:
> address.user.unlink(address)
> - store.remove(address)
> + store.delete(address)
>
> @dbconnection
> def get_address(self, store, email):
> """See `IUserManager`."""
> - addresses = store.find(Address, email=email.lower())
> + addresses = store.query(Address).filter_by(email=email.lower())
> if addresses.count() == 0:
> return None
> return addresses.one()
> @@ -115,12 +115,12 @@
> @dbconnection
> def addresses(self, store):
> """See `IUserManager`."""
> - for address in store.find(Address):
> + for address in store.query(Address).all():
> yield address
>
> @property
> @dbconnection
> def members(self, store):
> """See `IUserManager."""
> - for member in store.find(Member):
> + for member in store.query(Member).all():
> yield member
>
> === removed file 'src/mailman/model/version.py'
> --- src/mailman/model/version.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/model/version.py 1970-01-01 00:00:00 +0000
> @@ -1,44 +0,0 @@
> -# Copyright (C) 2007-2014 by the Free Software Foundation, Inc.
> -#
> -# This file is part of GNU Mailman.
> -#
> -# GNU Mailman is free software: you can redistribute it and/or modify it under
> -# the terms of the GNU General Public License as published by the Free
> -# Software Foundation, either version 3 of the License, or (at your option)
> -# any later version.
> -#
> -# GNU Mailman is distributed in the hope that it will be useful, but WITHOUT
> -# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
> -# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
> -# more details.
> -#
> -# You should have received a copy of the GNU General Public License along with
> -# GNU Mailman. If not, see <http://www.gnu.org/licenses/>.
> -
> -"""Model class for version numbers."""
> -
> -from __future__ import absolute_import, print_function, unicode_literals
> -
> -__metaclass__ = type
> -__all__ = [
> - 'Version',
> - ]
> -
> -from storm.locals import Int, Unicode
> -from mailman.database.model import Model
> -
> -
> -
> -class Version(Model):
> - id = Int(primary=True)
> - component = Unicode()
> - version = Unicode()
> -
> - # The testing machinery will generally reset all tables, however because
> - # this table tracks schema migrations, we do not want to reset it.
> - PRESERVE = True
> -
> - def __init__(self, component, version):
> - super(Version, self).__init__()
> - self.component = component
> - self.version = version
>
> === modified file 'src/mailman/rest/validator.py'
> --- src/mailman/rest/validator.py 2014-04-28 15:23:35 +0000
> +++ src/mailman/rest/validator.py 2014-10-14 01:13:52 +0000
> @@ -54,7 +54,7 @@
> return self._enum_class[enum_value]
> except KeyError as exception:
> # Retain the error message.
> - raise ValueError(exception.message)
> + raise ValueError(exception.args[0])
>
>
> def subscriber_validator(subscriber):
>
> === modified file 'src/mailman/styles/base.py'
> --- src/mailman/styles/base.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/styles/base.py 2014-10-14 01:13:52 +0000
> @@ -64,12 +64,8 @@
> mlist.info = ''
> mlist.preferred_language = 'en'
> mlist.subject_prefix = _('[$mlist.display_name] ')
> - # Set this to Never if the list's preferred language uses us-ascii,
> - # otherwise set it to As Needed.
> - if mlist.preferred_language.charset == 'us-ascii':
> - mlist.encode_ascii_prefixes = 0
> - else:
> - mlist.encode_ascii_prefixes = 2
> + mlist.encode_ascii_prefixes = (
> + mlist.preferred_language.charset != 'us-ascii')
>
>
>
>
> === modified file 'src/mailman/testing/layers.py'
> --- src/mailman/testing/layers.py 2014-01-01 14:59:42 +0000
> +++ src/mailman/testing/layers.py 2014-10-14 01:13:52 +0000
> @@ -190,6 +190,10 @@
> @classmethod
> def tearDown(cls):
> assert cls.var_dir is not None, 'Layer not set up'
> + # Reset the test database after the tests are done so that no stale
> + # data is left behind in case the tests are rerun with a database
> + # layer like MySQL or PostgreSQL, which is not dropped in teardown.
> + reset_the_world()
> config.pop('test config')
> shutil.rmtree(cls.var_dir)
> cls.var_dir = None
>
> === modified file 'src/mailman/testing/testing.cfg'
> --- src/mailman/testing/testing.cfg 2014-01-01 14:59:42 +0000
> +++ src/mailman/testing/testing.cfg 2014-10-14 01:13:52 +0000
> @@ -20,7 +20,7 @@
> # For testing against PostgreSQL.
> # [database]
> # class: mailman.database.postgresql.PostgreSQLDatabase
> -# url: postgres://barry:barry@localhost/mailman
> +# url: postgresql://$USER:$USER@localhost/mailman_test
>
> [mailman]
> site_owner: noreply(a)example.com
>
> === modified file 'src/mailman/utilities/importer.py'
> --- src/mailman/utilities/importer.py 2014-04-14 16:14:13 +0000
> +++ src/mailman/utilities/importer.py 2014-10-14 01:13:52 +0000
> @@ -172,6 +172,9 @@
> personalize=Personalization,
> preferred_language=check_language_code,
> reply_goes_to_list=ReplyToMunging,
> + allow_list_posts=bool,
> + include_rfc2369_headers=bool,
> + nntp_prefix_subject_too=bool,
> )
>
>
> @@ -186,7 +189,6 @@
> filter_mime_types='filter_types',
> generic_nonmember_action='default_nonmember_action',
> include_list_post_header='allow_list_posts',
> - last_post_time='last_post_at',
> member_moderation_action='default_member_action',
> mod_password='moderator_password',
> news_moderation='newsgroup_moderation',
> @@ -198,6 +200,14 @@
> send_welcome_msg='send_welcome_message',
> )
>
> +# These DateTime fields of the mailinglist table need a type conversion to
> +# Python datetime object for SQLite databases.
> +DATETIME_COLUMNS = [
> + 'created_at',
> + 'digest_last_sent_at',
> + 'last_post_time',
> + ]
> +
> EXCLUDES = set((
> 'digest_members',
> 'members',
> @@ -217,6 +227,9 @@
> # Some attributes must not be directly imported.
> if key in EXCLUDES:
> continue
> + # These objects need explicit type conversions.
> + if key in DATETIME_COLUMNS:
> + continue
> # Some attributes from Mailman 2 were renamed in Mailman 3.
> key = NAME_MAPPINGS.get(key, key)
> # Handle the simple case where the key is an attribute of the
> @@ -238,6 +251,15 @@
> except (TypeError, KeyError):
> print('Type conversion error for key "{}": {}'.format(
> key, value), file=sys.stderr)
> + for key in DATETIME_COLUMNS:
> + try:
> + value = datetime.datetime.utcfromtimestamp(config_dict[key])
> + except KeyError:
> + continue
> + if key == 'last_post_time':
> + setattr(mlist, 'last_post_at', value)
> + continue
> + setattr(mlist, key, value)
> # Handle the archiving policy. In MM2.1 there were two boolean options
> # but only three of the four possible states were valid. Now there's just
> # an enum.
>
> === modified file 'src/mailman/utilities/modules.py'
> --- src/mailman/utilities/modules.py 2014-04-28 15:23:35 +0000
> +++ src/mailman/utilities/modules.py 2014-10-14 01:13:52 +0000
> @@ -22,6 +22,7 @@
> __metaclass__ = type
> __all__ = [
> 'call_name',
> + 'expand_path',
> 'find_components',
> 'find_name',
> 'scan_module',
> @@ -31,7 +32,7 @@
> import os
> import sys
>
> -from pkg_resources import resource_listdir
> +from pkg_resources import resource_filename, resource_listdir
>
>
>
> @@ -110,3 +111,15 @@
> continue
> for component in scan_module(module, interface):
> yield component
> +
> +
> +
> +def expand_path(url):
> + """Expand a python: path, returning the absolute file system path."""
> + # Is the context coming from a file system or Python path?
> + if url.startswith('python:'):
> + resource_path = url[7:]
> + package, dot, resource = resource_path.rpartition('.')
> + return resource_filename(package, resource + '.cfg')
> + else:
> + return url
>
--
https://code.launchpad.net/~barry/mailman/abhilash/+merge/238222
Your team Mailman Coders is requested to review the proposed merge of lp:~barry/mailman/abhilash into lp:mailman.
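One pattern worth calling out from the diff above: nearly every method that used to call Store.of(self) now takes an explicit store argument and is wrapped in the @dbconnection decorator from mailman.database.transaction. The decorator itself is not shown in this diff; the following is only a rough sketch of the idea, assuming the current session is reachable as config.db.store (that attribute path is an assumption made for the sketch, not taken from the branch):

    from functools import wraps

    class _FakeDatabase:
        # Stand-in for Mailman's configured database object; in the real
        # code `store` would be a sqlalchemy.orm.Session.
        store = 'fake-session'

    class _FakeConfig:
        db = _FakeDatabase()

    config = _FakeConfig()

    def dbconnection(function):
        """Inject the active session as the second positional argument, so
        methods are written as method(self, store, ...) but keep being
        called as method(self, ...)."""
        @wraps(function)
        def wrapper(self, *args, **kwargs):
            return function(self, config.db.store, *args, **kwargs)
        return wrapper

    class Roster:
        @dbconnection
        def member_count(self, store):
            # The real rosters would run store.query(Member)...count() here.
            return store

    print(Roster().member_count())    # prints: fake-session

Stacking it under @property, as the diff does, works because the wrapper only ever needs self from the caller.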
Oct. 14, 2014
A short first comment: in MANIFEST.in, according to the Python docs, the "include" directive is not recursive, so we can't just include "*.mako" and "*.py"; we have to keep the full path to the files' directories. You can check this by running sdist: the database/alembic/version/*.py files and the mako script will not be included.
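For example, something along these lines would work (the exact paths are illustrative guesses at the layout, not copied from the branch):

    # "include" is anchored at the project root and does not recurse:
    include src/mailman/database/alembic/script.py.mako
    include src/mailman/database/alembic/versions/*.py

    # whereas "recursive-include" walks the whole subtree:
    recursive-include src/mailman/database/alembic *.py *.mako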
--
https://code.launchpad.net/~barry/mailman/abhilash/+merge/238222
Your team Mailman Coders is requested to review the proposed merge of lp:~barry/mailman/abhilash into lp:mailman.
Barry Warsaw has proposed merging lp:~barry/mailman/abhilash into lp:mailman.
Requested reviews:
Aurélien Bompard (abompard)
Abhilash Raj (raj-abhilash1)
Mailman Coders (mailman-coders)
For more details, see:
https://code.launchpad.net/~barry/mailman/abhilash/+merge/238222
Merge branch of work, primarily done by Abhilash and Aurélien, porting the MM3 core from Storm to SQLAlchemy and Alembic. This is very nearly ready for merging to trunk.
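For readers who don't know both ORMs, the shape of the change in the diff above is: Storm property declarations (Int(), Unicode(), Bool(), Reference(...)) become SQLAlchemy Column(...) declarations plus an explicit __tablename__, and Storm's __storm_loaded__ hook is replaced by a listener on SQLAlchemy's 'load' event wired up in __declare_last__. A minimal, self-contained sketch of those two idioms, using a made-up Widget model rather than any real Mailman table (not the Mailman code itself):

    from sqlalchemy import Boolean, Column, Integer, Unicode, create_engine
    from sqlalchemy.event import listen
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker

    Base = declarative_base()      # Mailman's Model base class plays this role

    class Widget(Base):
        __tablename__ = 'widget'

        id = Column(Integer, primary_key=True)   # was: id = Int(primary=True)
        name = Column(Unicode)                    # was: name = Unicode()
        enabled = Column(Boolean)                 # was: enabled = Bool()

        def __init__(self, name):
            self.name = name
            self.enabled = True
            self._post_load()      # the constructor does not fire 'load'

        def _post_load(self, *args):
            # Rebuild non-persistent state, e.g. Mailman's roster objects.
            self.label = '<{0}>'.format(self.name)

        @classmethod
        def __declare_last__(cls):
            # Called once all mappers are configured; hook up 'load' here.
            listen(cls, 'load', cls._post_load)

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()
    session.add(Widget('ant'))
    session.commit()
    session.expunge_all()
    print(session.query(Widget).first().label)    # prints: <ant>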
--
https://code.launchpad.net/~barry/mailman/abhilash/+merge/238222
Your team Mailman Coders is requested to review the proposed merge of lp:~barry/mailman/abhilash into lp:mailman.
[Bug 1082746] [NEW] Automated processes can swamp a list with web subscription requests.
by Mark Sapiro Oct. 2, 2014
Oct. 2, 2014
Public bug reported:
There are discussions of this in threads at
<http://mail.python.org/pipermail/mailman-users/2012-October/074213.html>,
<http://mail.python.org/pipermail/mailman-users/2012-October/074278.html> and
<http://mail.python.org/pipermail/mailman-users/2012-November/074412.html>.
The Mailman developers do not think there is any way to prevent this
other than disabling web subscription entirely, since by definition
subscription requests come from unauthenticated users.
However, an attempt will be made to mitigate this with a site option to
include a dynamically generated hidden hash in the subscribe form, which
will at least force an automated process to GET and parse the listinfo
form immediately before POSTing it.
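As a rough illustration of that mitigation (this is not Mailman's code; the secret handling, field names and expiry policy below are invented for the sketch), the hidden hash could be an HMAC over the list name and the time the form was generated, checked again when the form is POSTed:

    import hashlib
    import hmac
    import time

    SITE_SECRET = b'replace-with-a-per-site-secret'
    MAX_FORM_AGE = 600      # seconds a fetched listinfo form stays valid

    def make_form_token(listname, when=None):
        # Both values would go into hidden <input> fields on the form.
        when = int(time.time() if when is None else when)
        msg = '{0}:{1}'.format(listname, when).encode('utf-8')
        return when, hmac.new(SITE_SECRET, msg, hashlib.sha1).hexdigest()

    def check_form_token(listname, when, digest):
        # Reject stale forms and anything whose HMAC does not match.
        expected = hmac.new(
            SITE_SECRET,
            '{0}:{1}'.format(listname, int(when)).encode('utf-8'),
            hashlib.sha1).hexdigest()
        fresh = (time.time() - int(when)) < MAX_FORM_AGE
        return fresh and hmac.compare_digest(expected, digest)

    when, digest = make_form_token('mylist')
    assert check_form_token('mylist', when, digest)

An automated process then has to fetch the form first and replay the exact values within the validity window, which raises the cost of the kind of flood described above.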
** Affects: mailman
Importance: Medium
Assignee: Mark Sapiro (msapiro)
Status: In Progress
--
You received this bug notification because you are a member of Mailman
Coders, which is subscribed to GNU Mailman.
https://bugs.launchpad.net/bugs/1082746
Title:
Automated processes can swamp a list with web subscription requests.
To manage notifications about this bug go to:
https://bugs.launchpad.net/mailman/+bug/1082746/+subscriptions