Merge branch 'main' into avm99963-monorail
Merged commit 3779da353b36d43cf778e7d4f468097714dd4540
GitOrigin-RevId: 6451a5c6b75afb0fd1f37b3f14521148d0722ea8
diff --git a/.vpython3 b/.vpython3
index 909a560..9cc9d49 100644
--- a/.vpython3
+++ b/.vpython3
@@ -11,6 +11,10 @@
name: "infra/python/wheels/appengine-python-standard-py3"
version: "version:0.3.1"
>
+wheel: <
+ name: "infra/python/wheels/ezt-py2_py3"
+ version: "version:1.1"
+>
wheel: <
name: "infra/python/wheels/flask-py2_py3"
@@ -38,22 +42,47 @@
>
wheel: <
+ name: "infra/python/wheels/google-cloud-tasks-py2_py3"
+ version: "version:2.8.1"
+>
+
+wheel: <
+ name: "infra/python/wheels/httpagentparser-py2_py3"
+ version: "version:1.9.3"
+>
+
+wheel: <
name: "infra/python/wheels/httplib2-py3"
version: "version:0.19.1"
>
wheel: <
+ name: "infra/python/wheels/mysqlclient/${vpython_platform}"
+ version: "version:2.1.1"
+>
+
+wheel: <
name: "infra/python/wheels/oauth2client-py2_py3"
version: "version:4.1.3"
>
wheel: <
+ name: "infra/python/wheels/protorpc-py2_py3"
+ version: "version:0.12.0"
+>
+
+wheel: <
name: "infra/python/wheels/six-py2_py3"
version: "version:1.15.0"
>
# Required for testing only.
wheel: <
+ name: "infra/python/wheels/mox3-py2_py3"
+ version: "version:1.1.0"
+>
+
+wheel: <
name: "infra/python/wheels/pytest-py3"
version: "version:6.2.2"
>
@@ -245,6 +274,17 @@
version: "version:0.4.8"
>
+# Required by mox3==1.1.0
+wheel: <
+ name: "infra/python/wheels/pbr-py2_py3"
+ version: "version:5.9.0"
+>
+
+wheel: <
+ name: "infra/python/wheels/fixtures-py2_py3"
+ version: "version:4.0.1"
+>
+
# Required by pytest==6.2.2
wheel: <
name: "infra/python/wheels/iniconfig-py3"
diff --git a/Makefile b/Makefile
index ac8c92a..cdf9f53 100644
--- a/Makefile
+++ b/Makefile
@@ -75,6 +75,9 @@
../../cipd/protoc \
--python_out=. --prpc-python_out=. proto/*.proto
+pytest:
+ GAE_RUNTIME=python3 GAE_APPLICATION=testbed-test SERVER_SOFTWARE=test pytest
+
test:
../../test.py test appengine/monorail
@@ -123,6 +126,11 @@
run: serve
+pydeps:
+ pip install -r requirements.txt
+
+jsdeps: deps
+
deps: node_deps
rm -f static/dist/*
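The new `pytest` Makefile target depends on `GAE_RUNTIME`, `GAE_APPLICATION`, and `SERVER_SOFTWARE` being set before pytest imports any App Engine code. A minimal sketch of the same invocation from Python, assuming only the variable names shown in the target above (the wrapper itself is not part of this change):

```python
# Hedged sketch: reproduce the `pytest` Makefile target from Python. The
# environment variable names come from the target above; everything else
# (defaults, argument passthrough) is an assumption for illustration.
import os
import sys

import pytest

os.environ.setdefault('GAE_RUNTIME', 'python3')
os.environ.setdefault('GAE_APPLICATION', 'testbed-test')
os.environ.setdefault('SERVER_SOFTWARE', 'test')

# Forward any extra arguments, e.g. a test file path or a -k expression.
sys.exit(pytest.main(sys.argv[1:]))
```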
diff --git a/README.md b/README.md
index d4fe749..7e7c37d 100644
--- a/README.md
+++ b/README.md
@@ -21,82 +21,11 @@
*For Googlers:* Monorail's codebase is open source and can be installed locally on your workstation of choice.
-Here's how to run Monorail locally for development on MacOS and Debian stretch/buster or its derivatives.
-
-1. You need to [get the Chrome Infra depot_tools commands](https://commondatastorage.googleapis.com/chrome-infra-docs/flat/depot_tools/docs/html/depot_tools_tutorial.html#_setting_up) to check out the source code and all its related dependencies and to be able to send changes for review.
-1. Check out the Monorail source code
- 1. `cd /path/to/empty/workdir`
- 1. `fetch infra` (make sure you are not "fetch internal_infra" )
- 1. `cd infra/appengine/monorail`
-1. Make sure you have the AppEngine SDK:
- 1. It should be fetched for you by step 1 above (during runhooks)
- 1. Otherwise, you can download it from https://developers.google.com/appengine/downloads#Google_App_Engine_SDK_for_Python
- 1. Also follow https://cloud.google.com/appengine/docs/standard/python3/setting-up-environment to setup `gcloud`
-1. Install CIPD dependencies:
- 1. `gclient runhooks`
-1. Install MySQL v5.6.
- 1. On Mac, use [homebrew](https://brew.sh/) to install MySQL v5.6:
- 1. `brew install mysql@5.6`
- 1. Otherwise, download from the [official page](http://dev.mysql.com/downloads/mysql/5.6.html#downloads).
- 1. **Do not download v5.7 (as of April 2016)**
-1. Set up SQL database. (You can keep the same sharding options in settings.py that you have configured for production.).
- 1. Copy setup schema into your local MySQL service.
- 1. `mysql --user=root -e 'CREATE DATABASE monorail;'`
- 1. `mysql --user=root monorail < schema/framework.sql`
- 1. `mysql --user=root monorail < schema/project.sql`
- 1. `mysql --user=root monorail < schema/tracker.sql`
- 1. `exit`
-1. Configure the site defaults in settings.py. You can leave it as-is for now.
-1. Set up the front-end development environment:
- 1. On Debian
- 1. ``eval `../../go/env.py` `` -- you'll need to run this in any shell you
- wish to use for developing Monorail. It will add some key directories to
- your `$PATH`.
- 1. Install build requirements:
- 1. `sudo apt-get install build-essential automake`
- 1. On MacOS
- 1. [Install homebrew](https://brew.sh)
- 1. Install node and npm
- 1. Install node version manager `brew install nvm`
- 1. See the brew instructions on updating your shell's configuration
- 1. Install node and npm `nvm install 12.13.0`
- 1. Add the following to the end of your `~/.zshrc` file:
-
- export NVM_DIR="$HOME/.nvm"
- [ -s "/usr/local/opt/nvm/nvm.sh" ] && . "/usr/local/opt/nvm/nvm.sh" # This loads nvm
- [ -s "/usr/local/opt/nvm/etc/bash_completion.d/nvm" ] && . "/usr/local/opt/nvm/etc/bash_completion.d/nvm" # This loads nvm bash_completion
-
-1. Install Python and JS dependencies:
- 1. Optional: You may need to install `pip`. You can verify whether you have it installed with `which pip`.
- 1. make sure to install `pip` using `python2` instead of `python3` (use `python --version` to check the version for 2.7, `which python2` to check the path)
- 1. If you need python 2.7 for now: `sudo apt install python2.7 python2.7-dev python-is-python2`
- 1. `curl -O /tmp/get-pip.py https://bootstrap.pypa.io/pip/2.7/get-pip.py`
- 1. `sudo python /tmp/get-pip.py`
- 1. Use `virtualenv` to keep from modifying system dependencies.
- 1. `pip install virtualenv`
- 1. `python -m virtualenv venv` to set up virtualenv within your monorail directory.
- 1. `source venv/bin/activate` to activate it, needed in each terminal instance of the directory.
- 1. Mac only: install [libssl](https://github.com/PyMySQL/mysqlclient-python/issues/74), needed for mysqlclient. (do this in local env not virtual env)
- 1. `brew install openssl; export LIBRARY_PATH=$LIBRARY_PATH:/usr/local/opt/openssl/lib/`
- 1. `make dev_deps` (run in virtual env)
- 1. `make deps` (run in virtual env)
-1. Run the app:
- 1. `make serve` (run in virtual env)
- 1. Start MySQL:
- 1. Mac: `brew services restart mysql@5.6`
- 1. Linux: `mysqld`
-1. Browse the app at localhost:8080 your browser.
-1. Set up your test user account (these steps are a little odd, but just roll with it):
- 1. Sign in using `test@example.com`
- 1. Log back out and log in again as `example@example.com`
- 1. Log out and finally log in again as `test@example.com`.
- 1. Everything should work fine now.
-1. Modify your Monorail User row in the database and make that user a site admin. You will need to be a site admin to gain access to create projects through the UI.
- 1. `mysql --user=root monorail -e "UPDATE User SET is_site_admin = TRUE WHERE email = 'test@example.com';"`
- 1. If the admin change isn't immediately apparent, you may need to restart your local dev appserver. If you kill the dev server before running the docker command, the restart may not be necessary.
+For local development on Linux, see [Linux development instructions](doc/development-linux.md).
Instructions for deploying Monorail to an existing instance or setting up a new instance are [here](doc/deployment.md).
+See also: [Common Development Problems](doc/development-problems.md)
## Feature Launch Tracking
@@ -158,83 +87,6 @@
Just remember to remove them before you upload your CL.
-## Troubleshooting
-
-* `BindError: Unable to bind localhost:8080`
-
-This error occurs when port 8080 is already being used by an existing process. Oftentimes,
-this is a leftover Monorail devserver process from a past run. To quit whatever process is
-on port 8080, you can run `kill $(lsof -ti:8080)`.
-
-* `RuntimeError: maximum recursion depth exceeded while calling a Python object`
-
-If running `make serve` gives an output similar to [this](https://paste.googleplex.com/4693398234595328),
-1. make sure you're using a virtual environment (see above for how to configure one). Then, make the changes outlined in [this CL](https://chromium-review.googlesource.com/c/infra/infra/+/3152656).
-1. Also try `pip install protobuf`
-
-* `gcloud: command not found`
-
-Add the following to your `~/.zshrc` file: `alias gcloud='/Users/username/google-cloud-sdk/bin/gcloud'`. Replace `username` with your Google username.
-
-* `TypeError: connect() got an unexpected keyword argument 'charset'`
-
-This error occurs when `dev_appserver` cannot find the MySQLdb library. Try installing it via <code>sudo apt-get install python-mysqldb</code>.
-
-* `TypeError: connect() argument 6 must be string, not None`
-
-This occurs when your mysql server is not running. Check if it is running with `ps aux | grep mysqld`. Start it up with <code>/etc/init.d/mysqld start </code>on linux, or just <code>mysqld</code>.
-
-* dev_appserver says `OSError: [Errno 24] Too many open files` and then lists out all source files
-
-dev_appserver wants to reload source files that you have changed in the editor, however that feature does not seem to work well with multiple GAE modules and instances running in different processes. The workaround is to control-C or `kill` the dev_appserver processes and restart them.
-
-* `IntegrityError: (1364, "Field 'comment_id' doesn't have a default value")` happens when trying to file or update an issue
-
-In some versions of SQL, the `STRICT_TRANS_TABLES` option is set by default. You'll have to disable this option to stop this error.
-
-* `ImportError: No module named six.moves`
-
-You may not have six.moves in your virtual environment and you may need to install it.
-
-1. Determine that python and pip versions are possibly in vpython-root
- 1. `which python`
- 1. `which pip`
-1. If your python and pip are in vpython-root
- 1. ```sudo `which python` `which pip` install six```
-
-* `enum hash not match` when run `make dev_peds`
-
-You may run the app using python3 instead of python2.
-
-1. Determine the python version used in virtual environment `python --version` if it's 3.X
-
- `deactivate`
-
- `rm -r venv/`
-
- `pip uninstall virtualenv`
-
- `pip uninstall pip`
-
- in local environment `python --version` make sure to change it to python2
-
- follow previous to instructions to reinstall `pip` and `virtualenv`
-
-* `mysql_config not found` when run `make dev_deps`
-
- this may be caused installing the wrong version of six. run `pip list` in virtual env make sure it is 1.15.0
- if not
-
- `deactivate`
-
- `rm -r venv/`
-
- `pip uninstall six`
-
- `pip install six==1.15.0`
-
- `virtualenv venv`
-
# Development resources
## Supported browsers
@@ -262,7 +114,7 @@
## Release process
-See: [Monorail Deployment](doc/deployment.md)
+See: [Monorail Deployment](http://go/monorail-deploy)
# User guide
diff --git a/api/monorail_servicer.py b/api/monorail_servicer.py
index 8968ec4..7dfdf0c 100644
--- a/api/monorail_servicer.py
+++ b/api/monorail_servicer.py
@@ -9,6 +9,7 @@
import cgi
import functools
+import httplib
import logging
import sys
import time
@@ -17,12 +18,10 @@
from google.appengine.api import users
from google.protobuf import json_format
from components.prpc import codes
-from components.prpc import server
import settings
from framework import authdata
from framework import exceptions
-from framework import framework_bizobj
from framework import framework_constants
from framework import monitoring
from framework import monorailcontext
@@ -43,10 +42,27 @@
# Optional header to help prevent double updates.
REQUEST_ID_HEADER = 'x-request-id'
+# TODO(https://crbug.com/1346473)
+_PRPC_TO_HTTP_STATUS = {
+ codes.StatusCode.OK: httplib.OK,
+ codes.StatusCode.CANCELLED: httplib.NO_CONTENT,
+ codes.StatusCode.INVALID_ARGUMENT: httplib.BAD_REQUEST,
+ codes.StatusCode.DEADLINE_EXCEEDED: httplib.SERVICE_UNAVAILABLE,
+ codes.StatusCode.NOT_FOUND: httplib.NOT_FOUND,
+ codes.StatusCode.ALREADY_EXISTS: httplib.CONFLICT,
+ codes.StatusCode.PERMISSION_DENIED: httplib.FORBIDDEN,
+ codes.StatusCode.RESOURCE_EXHAUSTED: httplib.SERVICE_UNAVAILABLE,
+ codes.StatusCode.FAILED_PRECONDITION: httplib.PRECONDITION_FAILED,
+ codes.StatusCode.OUT_OF_RANGE: httplib.BAD_REQUEST,
+ codes.StatusCode.UNIMPLEMENTED: httplib.NOT_IMPLEMENTED,
+ codes.StatusCode.INTERNAL: httplib.INTERNAL_SERVER_ERROR,
+ codes.StatusCode.UNAVAILABLE: httplib.SERVICE_UNAVAILABLE,
+ codes.StatusCode.UNAUTHENTICATED: httplib.UNAUTHORIZED,
+}
def ConvertPRPCStatusToHTTPStatus(context):
"""pRPC uses internal codes 0..16, but we want to report HTTP codes."""
- return server._PRPC_TO_HTTP_STATUS.get(context._code, 500)
+ return _PRPC_TO_HTTP_STATUS.get(context._code, 500)
def PRPCMethod(func):
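The hunk above replaces a lookup into `components.prpc.server`'s private `_PRPC_TO_HTTP_STATUS` table with a local copy (the v3 servicer below does the same). A self-contained sketch of the lookup-with-default pattern; the `FakeContext` class and the code values are illustrative assumptions, not the real `components.prpc` types:

```python
# Sketch of the inlined pRPC -> HTTP translation. Unknown codes fall back to
# 500, mirroring ConvertPRPCStatusToHTTPStatus above.
try:
  import httplib  # Python 2, as used in the servicer code
except ImportError:
  import http.client as httplib  # Python 3 equivalent

_SKETCH_PRPC_TO_HTTP = {
    'OK': httplib.OK,
    'NOT_FOUND': httplib.NOT_FOUND,
    'PERMISSION_DENIED': httplib.FORBIDDEN,
}


class FakeContext(object):
  """Stand-in for components.prpc context.ServicerContext in this sketch."""

  def __init__(self, code):
    self._code = code


def convert_status(context):
  return _SKETCH_PRPC_TO_HTTP.get(context._code, 500)


assert convert_status(FakeContext('NOT_FOUND')) == 404
assert convert_status(FakeContext('SOMETHING_ELSE')) == 500
```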
diff --git a/api/test/features_servicer_test.py b/api/test/features_servicer_test.py
index 177bc3f..21154de 100644
--- a/api/test/features_servicer_test.py
+++ b/api/test/features_servicer_test.py
@@ -9,7 +9,10 @@
from __future__ import absolute_import
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from components.prpc import codes
from components.prpc import context
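The same try/except import shim recurs in every test file touched below; it prefers the `mox3` wheel added to `.vpython3` and falls back to the legacy standalone `mox`. Isolated, the pattern looks like this (the usage lines are a generic sketch of the mox record/replay/verify cycle, not code from these tests):

```python
# Prefer mox3 (Python 3 compatible) but keep working where only classic mox
# is installed; both expose the same `mox` module interface.
try:
  from mox3 import mox
except ImportError:
  import mox

# The record/replay/verify cycle is unchanged whichever package wins:
m = mox.Mox()
stub = m.CreateMockAnything()
stub.DoSomething().AndReturn(42)   # record an expectation
m.ReplayAll()
assert stub.DoSomething() == 42    # exercise the code under test
m.VerifyAll()
m.UnsetStubs()
```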
diff --git a/api/test/issues_servicer_test.py b/api/test/issues_servicer_test.py
index 2c46f7c..0ba50ac 100644
--- a/api/test/issues_servicer_test.py
+++ b/api/test/issues_servicer_test.py
@@ -8,7 +8,6 @@
from __future__ import division
from __future__ import absolute_import
-import logging
import sys
import time
import unittest
@@ -18,7 +17,6 @@
from components.prpc import codes
from components.prpc import context
-from components.prpc import server
from api import issues_servicer
from api import converters
@@ -76,7 +74,7 @@
self.issues_svcr = issues_servicer.IssuesServicer(
self.services, make_rate_limiter=False)
self.prpc_context = context.ServicerContext()
- self.prpc_context.set_code(server.StatusCode.OK)
+ self.prpc_context.set_code(codes.StatusCode.OK)
self.auth = authdata.AuthData(user_id=333, email='approver3@example.com')
self.fd_1 = tracker_pb2.FieldDef(
diff --git a/api/test/monorail_servicer_test.py b/api/test/monorail_servicer_test.py
index 8c5a1d3..3c2d8a6 100644
--- a/api/test/monorail_servicer_test.py
+++ b/api/test/monorail_servicer_test.py
@@ -11,9 +11,11 @@
import time
import unittest
import mock
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
-from components.prpc import server
from components.prpc import codes
from components.prpc import context
from google.appengine.ext import testbed
diff --git a/api/test/projects_servicer_test.py b/api/test/projects_servicer_test.py
index b3084c3..683fa90 100644
--- a/api/test/projects_servicer_test.py
+++ b/api/test/projects_servicer_test.py
@@ -13,16 +13,13 @@
from components.prpc import codes
from components.prpc import context
-from components.prpc import server
from api import projects_servicer
from api.api_proto import common_pb2
from api.api_proto import issue_objects_pb2
from api.api_proto import project_objects_pb2
from api.api_proto import projects_pb2
-from framework import authdata
from framework import exceptions
-from framework import framework_constants
from framework import monorailcontext
from framework import permissions
from proto import tracker_pb2
diff --git a/api/test/sitewide_servicer_test.py b/api/test/sitewide_servicer_test.py
index 3259fbb..a15fc75 100644
--- a/api/test/sitewide_servicer_test.py
+++ b/api/test/sitewide_servicer_test.py
@@ -12,9 +12,6 @@
import unittest
import mock
-from components.prpc import codes
-from components.prpc import context
-from components.prpc import server
import settings
from api import sitewide_servicer
diff --git a/api/test/users_servicer_test.py b/api/test/users_servicer_test.py
index aa25d18..f9b6480 100644
--- a/api/test/users_servicer_test.py
+++ b/api/test/users_servicer_test.py
@@ -10,10 +10,12 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from components.prpc import codes
from components.prpc import context
-from components.prpc import server
from api import users_servicer
from api.api_proto import common_pb2
diff --git a/api/v3/api_constants.py b/api/v3/api_constants.py
index 9752242..e21270c 100644
--- a/api/v3/api_constants.py
+++ b/api/v3/api_constants.py
@@ -9,8 +9,8 @@
# Max issues per page in the SearchIssues API.
MAX_ISSUES_PER_PAGE = 100
-# Max issues tp fetch in the BatchGetIssues API.
-MAX_BATCH_ISSUES = 100
+# Max issues to fetch in the BatchGetIssues API.
+MAX_BATCH_ISSUES = 1000
# Max issues to modify at once in the ModifyIssues API.
MAX_MODIFY_ISSUES = 100
diff --git a/api/v3/monorail_servicer.py b/api/v3/monorail_servicer.py
index d54f425..fea6e45 100644
--- a/api/v3/monorail_servicer.py
+++ b/api/v3/monorail_servicer.py
@@ -9,6 +9,7 @@
import cgi
import functools
+import httplib
import logging
import time
import sys
@@ -21,7 +22,6 @@
from google.appengine.api import app_identity
from google.protobuf import json_format
from components.prpc import codes
-from components.prpc import server
from framework import monitoring
@@ -51,10 +51,29 @@
# Domain for service account emails.
SERVICE_ACCOUNT_DOMAIN = 'gserviceaccount.com'
+# pylint: disable=pointless-string-statement
+
+# TODO(https://crbug.com/1346473)
+_PRPC_TO_HTTP_STATUS = {
+ codes.StatusCode.OK: httplib.OK,
+ codes.StatusCode.CANCELLED: httplib.NO_CONTENT,
+ codes.StatusCode.INVALID_ARGUMENT: httplib.BAD_REQUEST,
+ codes.StatusCode.DEADLINE_EXCEEDED: httplib.SERVICE_UNAVAILABLE,
+ codes.StatusCode.NOT_FOUND: httplib.NOT_FOUND,
+ codes.StatusCode.ALREADY_EXISTS: httplib.CONFLICT,
+ codes.StatusCode.PERMISSION_DENIED: httplib.FORBIDDEN,
+ codes.StatusCode.RESOURCE_EXHAUSTED: httplib.SERVICE_UNAVAILABLE,
+ codes.StatusCode.FAILED_PRECONDITION: httplib.PRECONDITION_FAILED,
+ codes.StatusCode.OUT_OF_RANGE: httplib.BAD_REQUEST,
+ codes.StatusCode.UNIMPLEMENTED: httplib.NOT_IMPLEMENTED,
+ codes.StatusCode.INTERNAL: httplib.INTERNAL_SERVER_ERROR,
+ codes.StatusCode.UNAVAILABLE: httplib.SERVICE_UNAVAILABLE,
+ codes.StatusCode.UNAUTHENTICATED: httplib.UNAUTHORIZED,
+}
def ConvertPRPCStatusToHTTPStatus(context):
"""pRPC uses internal codes 0..16, but we want to report HTTP codes."""
- return server._PRPC_TO_HTTP_STATUS.get(context._code, 500)
+ return _PRPC_TO_HTTP_STATUS.get(context._code, 500)
def PRPCMethod(func):
diff --git a/api/v3/test/hotlists_servicer_test.py b/api/v3/test/hotlists_servicer_test.py
index e9808b5..170cf2e 100644
--- a/api/v3/test/hotlists_servicer_test.py
+++ b/api/v3/test/hotlists_servicer_test.py
@@ -23,6 +23,7 @@
from framework import exceptions
from framework import monorailcontext
from framework import permissions
+from framework import sorting
from features import features_constants
from testing import fake
from services import features_svc
@@ -38,6 +39,7 @@
issue=fake.IssueService(),
project=fake.ProjectService(),
config=fake.ConfigService(),
+ cache_manager=fake.CacheManager(),
user=fake.UserService(),
usergroup=fake.UserGroupService())
self.hotlists_svcr = hotlists_servicer.HotlistsServicer(
@@ -110,6 +112,7 @@
is_private=True)
self.hotlist_resource_name = rnc.ConvertHotlistName(
self.hotlist_1.hotlist_id)
+ sorting.InitializeArtValues(self.services)
def CallWrapped(self, wrapped_handler, mc, *args, **kwargs):
self.converter = converters.Converter(mc, self.services)
diff --git a/api/v3/test/monorail_servicer_test.py b/api/v3/test/monorail_servicer_test.py
index 3569879..2abcaf9 100644
--- a/api/v3/test/monorail_servicer_test.py
+++ b/api/v3/test/monorail_servicer_test.py
@@ -11,9 +11,11 @@
import time
import unittest
import mock
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
-from components.prpc import server
from components.prpc import codes
from components.prpc import context
from google.appengine.ext import testbed
diff --git a/doc/deployment.md b/doc/deployment.md
deleted file mode 100644
index fd12add..0000000
--- a/doc/deployment.md
+++ /dev/null
@@ -1,348 +0,0 @@
-# Monorail Deployment
-
-*This document covers updating Chromium's instance of Monorail through
-Spinnaker. If you are looking to deploy your own instance of Monorail,
-see: [Creating a new Monorail instance](instance.md)*
-
-## Before your first deployment
-
-Spinnaker is a platform that helps teams configure and manage application
-deployment pipelines. We have a
-[ChromeOps Spinnaker](http://go/chrome-infra-spinnaker) instance that holds
-pipelines for several ChromeOps services, including Monorail.
-
-IMPORTANT: In the event of an unexpected failure in a Spinnaker pipeline, it is
-extremely important that the release engineer properly cleans up versions in the
-Appengine console (e.g. delete bad versions and manual rollback to previous
-version).
-
-### Spinnaker Traffic Splitting
-
-Spinnaker's traffic splitting, rollback, and cleanup systems rely heavily on the
-assumption that the currently deployed version always has the highest version
-number. During Rollbacks, Spinnaker will migrate traffic back to the second
-highest version number and delete the highest version number. So, if the
-previous deployed version was v013 and there was a v014 that was created but
-somehow never properly deleted, and Spinnaker just created a v015, which it is
-now trying to rollback, this means, spinnaker will migrate 100% traffic "back"
-to v014, which might be a bad version, and delete v015. The same could happen
-for traffic migrations. If a previous, good deployment is v013 with 100%
-traffic, and there is a bad deployment at v014 that was never cleaned up, during
-a new deployment, Spinnaker will have created v015 and begin traffic splitting
-between v014 and v015, which means our users are either being sent to the bad
-version or the new version.
-
-If you are ever unsure about how you should handle a manual cleanup and
-rollback, ping the [Monorail chat](http://chat/room/AAAACV9ZZ8k) and ask for
-help.
-
-Below are brief descriptions of all the pipelines that make up the Monorail
-deployment process in Spinnaker.
-
-#### Deploy Monorail
-
-This is the starting point of the Monorail deployment process and should be
-manually triggered by the Release Engineer.
-![start monorail deployment](md_images/start-deploy-monorail.png)
-This pipeline handles creating a Cloud Build of Monorail. The build can be created from
-HEAD of a given branch or it can re-build a previous Cloud Build given a "BUILD_ID".
-Once the build is complete, a `Deploy {Dev|Staging|Prod}` pipeline can be automatically
-triggered to deploy the build to an environment. On a regular weekly release, we should
-use the default "ENV" = dev, provide the release branch, and leave "BUILD_ID" empty.
-
-##### Parameter Options
-
-* The "BRANCH" parameter takes the name of the release branch that holds
- the commits we want to deploy.
- The input should be in the form of `refs/releases/monorail/[*deployment number*]`.
- e.g. "refs/releases/monorail/1" builds from HEAD of
- [infra/infra/+/refs/releases/monorail/1](https://chromium.googlesource.com/infra/infra/+/refs/releases/monorail/1).
-* The "ENV" parameter can be set to "dev", "staging", or "prod" to
- automatically trigger `Deploy Dev`, `Deploy Staging`, or `Deploy
- Production` (respectively) with a successful finish of `Deploy Monorail`.
- The "nodeploy" option means no new monorail version will get deployed
- to any environment.
-* The "BUILD_ID" parameter can take the id of a previous Cloud Build found
- [here](https://pantheon.corp.google.com/cloud-build/builds?organizationId=433637338589&src=ac&project=chrome-infra-spinnaker).
- We can use this to rebuild older Monorail versions. When the "BUILD_ID"
- is given "BRANCH" is ignored.
-
-#### Deploy Dev
-
-This pipeline handles deploying a new monorail-dev version and migrating traffic
-to the newest version.
-
-After a new version is created, but before traffic is migrated, there is a
-"Continue?" stage that waits on manual judgement. The release engineer is
-expected to do any testing in the newest version before confirming that the
-pipeline should continue with traffic migration. If there are any issues, the
-release engineer should select "Rollback", which triggers the `Rollback`
-pipeline. If "Continue" is selected, spinnaker will immediately migrate 100%
-traffic to to the newest version.
-![manual judgement stage](md_images/manual-judgement-stage.png)
-![continue options](md_images/continue-options.png)
-
-The successful finish of this pipeline triggers two pipelines: `Cleanup` and
-`Deploy Staging`.
-
-#### Deploy Development - EXPERIMENTAL
-
-Note that this pipeline is similar to the above `Deploy Dev` pipeline.
-This is for Prod Tech's experimental purposes. Please ignore this pipeline. This
-cannot be triggered by `Deploy Monorail`.
-
-#### Deploy Staging
-
-This pipeline handles deploying a new monorail-staging version and migrating
-traffic to the newest version.
-
-Like `Deploy Dev` after a new version is created, there is a
-"Continue?" stage that waits on manual judgement. The release engineer should
-test the new version before letting the pipeline proceed to traffic migration.
-If any issues are spotted, the release engineer should select "Rollback", to
-trigger the `Rollback` pipeline.
-
-Unlike `Deploy Dev`, after "Continue" is selected, spinnaker will
-proceed with three separate stages of traffic splitting with a waiting period
-between each traffic split.
-
-The successful finish of this pipeline triggers two pipelines: `Cleanup` and
-`Deploy Production`.
-
-#### Deploy Production
-
-This pipeline handles deploying a new monorail-prod version and migrating
-traffic to the newest version.
-
-This pipeline has the same set of stages as `Deploy Staging`. the successful
-finish of this pipeline triggers the `Cleanup` pipeline.
-
-#### Rollback
-
-This pipeline handles migrating traffic back from the newest version to the
-previous version and deleting the newest version. This pipeline is normally
-triggered by the `Rollback` stage of the `Deploy Dev|Staging|Production`
-pipelines and it only handles rolling back one of the applications, not all
-three.
-
-##### Parameter Options
-
-* "Stack" is a required parameter for this pipeline and can be one of "dev",
- "staging", or "prod". This determines which of monorail's three applications
- (monorail-dev, monorail-staging, monorail-prod) it should rollback. When
- `Rollback` is triggered by one of the above Deploy pipelines, the
- appropriate "Stack" value is passed. When the release engineer needs to
- manually trigger the `Rollback` pipeline they should make sure they are
- choosing the correct "Stack" to rollback.
- ![start rollback](md_images/start-rollback.png)
- ![rollback options](md_images/rollback-options.png)
-
-#### Cleanup
-
-This pipeline handles deleting the oldest version.
-
-For more details read [go/monorail-deployments](go/monorail-deployments) and
-[go/chrome-infra-appengine-deployments](go/chrome-infra-appengine-deployments).
-
-TODO(jojwang): Currently, notifications still need to be set up. See
-[b/138311682](https://b.corp.google.com/issues/138311682)
-
-### Notifications
-
-Monorail's pipelines in Spinnaker have been configured to send notifications to
-monorail-eng+spinnaker@google.com when:
-
-1. Any Monorail pipeline fails
-1. `Deploy Staging` requires manual judgement at the "Continue?" stage.
-1. `Deploy Production` requires manual judgement at the "Continue?" stage.
-
-## Deploying a new version to an existing instance using Spinnaker
-
-For each release cycle, a new `refs/releases/monorail/[*deployment number*]`
-branch is created at the latest [*commit SHA*] that we want to be part of the
-deployment. Spinnaker will take the [*deployment number*] and deploy from HEAD
-of the matching branch.
-
-Manual testing steps are added during Workflow's weekly meetings for each
-commit between the previous release and this release.
-
-## Spinnaker Deployment steps
-
-If any step below fails. Stop the deploy and ping
-[Monorail chat](http://chat/room/AAAACV9ZZ8k).
-
-1. Prequalify
- 1. Check for signs of trouble
- 1. [go/chops-hangout](http://go/chops-hangout)
- 1. [Viceroy](http://go/monorail-prod-viceroy)
- 1. [go/devx-pages](http://go/devx-pages)
- 1. [GAE dashboard](https://console.cloud.google.com/appengine?project=monorail-prod&duration=PT1H)
- 1. [Error Reporting](http://console.cloud.google.com/errors?time=P1D&order=COUNT_DESC&resolution=OPEN&resolution=ACKNOWLEDGED&project=monorail-prod)
- 1. If there are any significant operational problems with Monorail or ChOps
- in general, halt deploy and notify team.
-1. Assess
- 1. View the existing release branches with
- ```
- git ls-remote origin "refs/releases/monorail/*"
- ```
- Each row will show the deployment's *commit SHA* followed by the branch
- name. The value after monorail/ is the *deployment number*.
- 1. Your *deployment number* is the last deployment number + 1.
- 1. Your *commit SHA* is either from the commit you want to deploy from or
- the last commit from HEAD. To get the SHA from HEAD:
- ```
- git rev-parse HEAD
- ```
-1. Create branch
- 1. Create a new local branch at the desired [*commit SHA*]:
- ```
- git checkout -b <your_release_branch_name> [*commit SHA*]
- ```
- 1. [Optional] cherry pick another commit that is ahead of
- [*commit SHA*]:
- ```
- git cherry-pick -x [*cherry-picked commit SHA*]
- ```
- 1. Push your local branch to remote origin and tag it as
- <your_release_branch_name>:refs/releases/monorail/x, where x is your *deployment number*:
- ```
- git push origin <your_release_branch_name>:refs/releases/monorail/[*deployment number*]
- ```
- 1. If the branch already exists, [*commit SHA*] must be ahead of the
- current commit that the branch points to.
-1. Update Dev and Staging schema
- 1. Check for changes since last deploy:
- ```
- tail -30 schema/alter-table-log.txt
- ```
- If you don't see any changes since the last deploy, skip this section.
- 1. Copy and paste updates to the
- [primary DB](http://console.cloud.google.com/sql/instances/primary/overview?project=monorail-dev)
- in the `monorail-dev` project. Please be careful when pasting into SQL
- prompt.
- 1. Copy and paste the new changes into the
- [primary DB](http://console.cloud.google.com/sql/instances/primary/overview?project=monorail-staging)
- in staging.
-1. Start deployment
- 1. Navigate to the Monorail Delivery page at
- [go/spinnaker-deploy-monorail](https://spinnaker-1.endpoints.chrome-infra-spinnaker.cloud.goog/#/applications/monorail/executions)
- in Spinnaker.
- 1. Identify the `Deploy Monorail` Pipeline.
- 1. Click "Start Manual Execution".
- ![start monorail deployment](md_images/start-deploy-monorail.png)
- 1. The "BUILD_ID" field should be empty.
- 1. The "ENV" field should be set to "dev".
- 1. The "BRANCH" field should be set to
- "refs/releases/monorail/[*deployment number*]".
- 1. The notifications box can remain unchanged.
-1. Confirm monorail-dev was successfully deployed (Pipeline: `Deploy Dev`, Stage: "Continue?")
- 1. Find the new version using the
- [appengine dev version console](https://pantheon.corp.google.com/appengine/versions?organizationId=433637338589&project=monorail-dev).
- 1. Visit popular/essential pages and confirm they are all accessible.
- 1. If everything looks good, choose "Continue" for Deploy Dev.
- 1. If there is an issue, choose "Rollback" for this stage.
-1. Test on Staging (Pipeline: `Deploy Staging`, Stage: "Continue?")
- 1. Find the new version using the
- [appengine staging version console](https://pantheon.corp.google.com/appengine/versions?organizationId=433637338589&project=monorail-staging).
- 1. For each commit since last deploy, verify affected functionality still
- works.
- 1. Test using a non-admin account, unless you're verifying
- admin-specific functionality.
- 1. If you rolled back a previous attempt, make sure you test any
- changes that might have landed in the mean time.
- 1. Test that email works by updating any issue with an owner and/or cc
- list and confirming that the email shows up in
- [g/monorail-staging-emails](http://g/monorail-staging-emails) with
- all the correct recipients.
- 1. If everything looks good, choose "Continue" for Deploy Staging.
- 1. If there is an issue, choose "Rollback" for this stage.
-1. Update Prod Schema
- 1. If you made changes to the Dev and Prod schema, repeat them on the prod
- database.
-1. Test on Prod (Pipeline: `Deploy Production`, Stage: "Continue?")
- 1. Find the new version using the
- [appengine prod version console](https://pantheon.corp.google.com/appengine/versions?organizationId=433637338589&project=monorail-prod).
- 1. For each commit since last deploy, verify affected functionality still
- works. Test using a non-admin account, unless you're verifying
- admin-specific functionality.
- 1. Add a comment to an issue.
- 1. Enter a new issue and CC your personal account.
- 1. Verify that you got an email
- 1. Try doing a query that is not cached, then repeat it to test the cached
- case.
- 1. If everything looks good, choose "Continue" for Deploy Prod.
- 1. If there is an issue, choose "Rollback" for this stage.
-1. Monitor Viceroy and Error Reporting
- 1. Modest latency increases are normal in the first 10-20 minutes
- 1. Check
- [/p/chromium updates page](https://bugs.chromium.org/p/chromium/updates/list).
- 1. [Chromiumdash](https://chromiumdash.appspot.com/release-manager?platform=Android),
- should work after deployment.
-1. Announce the Deployment.
- 1. Include the [build id](https://screenshot.googleplex.com/KvzoxHEs6Qy.png) of the
- Cloud Build used for this deployment.
- 1. Include a link and name of the release branch used for the deployment.
- 1. Include list of changes that went out (obtained from section 2 above),
- or via `git log --oneline .` (use `--before` and `--after` as needed).
- 1. If there were schema changes, copy and paste the commands at the bottom
- of the email
- 1. Use the subject line:
- "Deployed Monorail to staging and prod with release branch [*deployment number*]"
- 1. Send the email to "monorail-eng@google.com" and
- "chrome-infra+monorail@google.com"
-1. Add a new row to the
- [Monorail Deployment Stats](http://go/monorail-deployment-stats) spreadsheet
- to help track deploys/followups/rollbacks. It is important to do this even
- if the deploy failed for some reason.
-
-### Rolling back and other unexpected situations.
-
-If issues are discovered after the "Continue?" stage and traffic migration has
-already started: Cancel the execution and manually start the `Rollback`
-pipeline. ![cancel executions](md_images/cancel-execution.png)
-
-If issues are discovered during the monorail-staging or monorail-prod deployment
-DO NOT forget to also run the `Rollback` pipeline for monorail-dev or
-monorail-dev and monorail-staging, respectively.
-
-If you are ever unsure on how to rollback or clean up unexpected Spinnaker
-errors please ping the [Monorail chat](http://chat/room/AAAACV9ZZ8k) for help.
-
-## Manually Deploying and Rolling back if Spinnaker is down.
-
-### Creating a new app version
-1. From infra/monorail, create a new local branch at the desired [*commit SHA*]. Ensure that the branch has no unmerged changes.
- ```
- git checkout -b <your_release_branch_name> [*commit SHA*]
- ```
- 1. [Optional] cherry pick another commit that is ahead of
- [*commit SHA*]:
- ```
- git cherry-pick -x [*cherry-picked commit SHA*]
- ```
-1. run
- ```
- eval `../../go/env.py`
- ```
-1. Create a new version with gae.py (replacing `deploy_dev` with `deploy_staging` or `deploy_prod`, if appropriate):
- ```
- make deploy_dev
- ```
- 1. [Optional] If you encounter `ImportError: No module named six.moves`, try again after running:
- [*commit SHA*]:
- ```
- sudo `which python` `which pip` install six
- ```
-1. The new version should show up in pantheon's App Engine's Versions [page](https://pantheon.corp.google.com/appengine/versions?src=ac&project=monorail-dev&serviceId=default). Traffic allocation should be 0%.
-
-### Migrating traffic to a previous or new version
-1. Confirm the new version you want to release or the old version you want to roll back to exists in [pantheon](https://pantheon.corp.google.com/appengine/versions?src=ac&project=monorail-dev&serviceId=api):
- 1. Confirm this for all services (default, besearch, latency-insensitive, api) via the Service dropdown.
- ![services-dropdown](md_images/pantheon-services-dropdown.png)
-1. Select the desired version and click "Migrate Traffic". REPEAT THIS FOR EVERY SERVICE.
- ![migrate-traffic](md_images/pantheon-migrate-traffic.png)
-
-
-## Creating and deploying a new Monorail instance
-
-See: [Creating a new Monorail instance](instance.md)
diff --git a/doc/development-linux.md b/doc/development-linux.md
new file mode 100644
index 0000000..2c18159
--- /dev/null
+++ b/doc/development-linux.md
@@ -0,0 +1,73 @@
+# Getting started with Monorail development on Linux
+
+These steps should only need to be taken once when setting up a new development
+machine:
+
+1. Install basic build dependencies:
+ 1. `sudo apt-get install git build-essential automake`
+ 1. [Install Chrome depot_tools](https://commondatastorage.googleapis.com/chrome-infra-docs/flat/depot_tools/docs/html/depot_tools_tutorial.html#_setting_up) and ensure it is in your `$PATH`.
+
+1. Check out the Monorail source code repository:
+ 1. `mkdir ~/src/`
+ 1. `cd ~/src/`
+ 1. `fetch infra` (Googlers may alternatively `fetch infra_internal`, but note that `gclient sync` will then always sync based on the current `infra_internal` branch instead of the current `infra` branch.)
+ 1. `cd infra/appengine/monorail`
+
+1. Install and configure the gcloud CLI:
+ 1. It should be fetched for you by `fetch infra` above under `~/src/gcloud/bin`. Add it to your `PATH` in `.bashrc`.
+ 1. Otherwise, follow https://cloud.google.com/sdk/docs/install (Googlers on Linux may use `sudo apt-get install -y google-cloud-sdk`)
+1. Configure gcloud:
+ 1. `gcloud auth login`
+ 1. `gcloud config set project monorail-dev`
+
+1. Install and configure Docker:
+ 1. `sudo apt install docker-ce` (Googlers will need to follow instructions at go/docker)
+ 1. `sudo systemctl start docker`
+ 1. `sudo adduser $USER docker`
+    1. Log out and back in to pick up the group change.
+ 1. `docker run hello-world` to verify installation.
+ 1. `gcloud auth configure-docker`
+
+1. Configure the MySQL database:
+ 1. `sudo apt install libmariadb-dev`
+    1. `docker-compose -f dev-services.yml up -d mysql`
+ 1. `docker cp schema/. mysql:/schema`
+    1. `docker exec -it mysql -- mysql --user=root -e 'CREATE DATABASE monorail;'`
+ 1. `docker exec -it mysql -- mysql --user=root monorail < schema/framework.sql`
+ 1. `docker exec -it mysql -- mysql --user=root monorail < schema/project.sql`
+ 1. `docker exec -it mysql -- mysql --user=root monorail < schema/tracker.sql`
+ 1. `docker exec -it mysql -- mysql --user=root monorail -e "UPDATE User SET is_site_admin = TRUE WHERE email LIKE '%@example.com';"`
+
+1. Install Monorail backend (python) prerequisites:
+ 1. `sudo apt install python2.7 python2.7-dev python-is-python2` (Googlers
+ will need to run `sudo apt-mark hold python2 python-is-python2
+ python2-dev` to prevent these from being uninstalled.)
+ 1. `curl -o /tmp/get-pip.py https://bootstrap.pypa.io/pip/2.7/get-pip.py`
+ 1. `sudo python2.7 /tmp/get-pip.py`
+ 1. `pip install virtualenv`
+ 1. `python2.7 -m virtualenv venv`
+
+1. Install Monorail frontend (node) prerequisites:
+ 1. `curl https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash`
+ 1. `source "$HOME/.nvm/nvm.sh"`
+ 1. `nvm install 12.13.1`
+
+These steps will need to be repeated regularly when developing Monorail:
+
+1. Enter the virtual environment (in every new terminal you create):
+ 1. `eval $(../../go/env.py)`
+ 1. `source ./venv/bin/activate`
+
+1. Install per-language dependencies (every few days to ensure you're up to date):
+ 1. `make dev_deps`
+ 1. `make js_deps`
+ 1. `npm install`
+
+1. Launch the Monorail server (once per machine/per reboot):
+ 1. `make serve`
+    1. Browse the app at localhost:8080 in your browser.
+
+1. Log in with the test user (once per machine or when starting with a fresh database):
+ 1. Sign in using `test@example.com`
+ 1. Log back out and log in again as `example@example.com`
+ 1. Log out and finally log in again as `test@example.com`.
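A quick way to confirm that the schema steps above actually populated the database is to connect from Python. This is a sketch under assumptions, not part of the new document: it assumes the `mysqlclient` wheel from `.vpython3` (imported as `MySQLdb`) and that the `mysql` container exposes port 3306 on localhost.

```python
# Hedged check that the monorail schema loaded into the local MySQL container.
import MySQLdb  # provided by the mysqlclient wheel in .vpython3

conn = MySQLdb.connect(host='127.0.0.1', user='root', db='monorail')
try:
  cursor = conn.cursor()
  cursor.execute('SHOW TABLES')
  # Expect tables from framework.sql, project.sql and tracker.sql.
  print(sorted(row[0] for row in cursor.fetchall()))
finally:
  conn.close()
```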
diff --git a/doc/development-problems.md b/doc/development-problems.md
new file mode 100644
index 0000000..75f4029
--- /dev/null
+++ b/doc/development-problems.md
@@ -0,0 +1,113 @@
+# Monorail Development Problems
+
+* `BindError: Unable to bind localhost:8080`
+
+This error occurs when port 8080 is already being used by an existing process. Oftentimes,
+this is a leftover Monorail devserver process from a past run. To quit whatever process is
+on port 8080, you can run `kill $(lsof -ti:8080)`.
+
+* `RuntimeError: maximum recursion depth exceeded while calling a Python object`
+
+If running `make serve` gives an output similar to [this](https://paste.googleplex.com/4693398234595328),
+1. make sure you're using a virtual environment (see above for how to configure one). Then, make the changes outlined in [this CL](https://chromium-review.googlesource.com/c/infra/infra/+/3152656).
+1. Also try `pip install protobuf`
+
+* `gcloud: command not found`
+
+Add the following to your `~/.zshrc` file: `alias gcloud='/Users/username/google-cloud-sdk/bin/gcloud'`. Replace `username` with your Google username.
+
+* `TypeError: connect() got an unexpected keyword argument 'charset'`
+
+This error occurs when `dev_appserver` cannot find the MySQLdb library. Try installing it via <code>sudo apt-get install python-mysqldb</code>.
+
+* `TypeError: connect() argument 6 must be string, not None`
+
+This occurs when your mysql server is not running. Check if it is running with `ps aux | grep mysqld`. Start it up with <code>/etc/init.d/mysqld start </code>on linux, or just <code>mysqld</code>.
+
+* dev_appserver says `OSError: [Errno 24] Too many open files` and then lists out all source files
+
+dev_appserver wants to reload source files that you have changed in the editor, however that feature does not seem to work well with multiple GAE modules and instances running in different processes. The workaround is to control-C or `kill` the dev_appserver processes and restart them.
+
+* `IntegrityError: (1364, "Field 'comment_id' doesn't have a default value")` happens when trying to file or update an issue
+
+In some versions of SQL, the `STRICT_TRANS_TABLES` option is set by default. You'll have to disable this option to stop this error.
+
+* `ImportError: No module named six.moves`
+
+You may not have six.moves in your virtual environment and you may need to install it.
+
+1. Check whether python and pip are coming from vpython-root:
+ 1. `which python`
+ 1. `which pip`
+1. If your python and pip are in vpython-root
+ 1. ```sudo `which python` `which pip` install six```
+
+* `enum hash not match` when running `make dev_deps`
+
+You may be running the app using python3 instead of python2.
+
+1. Determine the python version used in the virtual environment with `python --version`. If it's 3.X:
+
+ `deactivate`
+
+ `rm -r venv/`
+
+ `pip uninstall virtualenv`
+
+ `pip uninstall pip`
+
+    In the local environment, run `python --version` and make sure it is python2.
+
+    Follow the previous instructions to reinstall `pip` and `virtualenv`.
+
+* `mysql_config not found` when running `make dev_deps`
+
+    This may be caused by installing the wrong version of six. Run `pip list` in the virtual env and make sure six is 1.15.0.
+    If it is not:
+
+ `deactivate`
+
+ `rm -r venv/`
+
+ `pip uninstall six`
+
+ `pip install six==1.15.0`
+
+ `virtualenv venv`
+
+# Development resources
+
+## Supported browsers
+
+Monorail supports all browsers defined in the [Chrome Ops guidelines](https://chromium.googlesource.com/infra/infra/+/main/doc/front_end.md).
+
+File a browser compatibility bug
+[here](https://bugs.chromium.org/p/monorail/issues/entry?labels=Type-Defect,Priority-Medium,BrowserCompat).
+
+## Frontend code practices
+
+See: [Monorail Frontend Code Practices](doc/code-practices/frontend.md)
+
+## Monorail's design
+
+* [Monorail Data Storage](doc/design/data-storage.md)
+* [Monorail Email Design](doc/design/emails.md)
+* [How Search Works in Monorail](doc/design/how-search-works.md)
+* [Monorail Source Code Organization](doc/design/source-code-organization.md)
+* [Monorail Testing Strategy](doc/design/testing-strategy.md)
+
+## Triage process
+
+See: [Monorail Triage Guide](doc/triage.md).
+
+## Release process
+
+See: [Monorail Deployment](doc/deployment.md)
+
+# User guide
+
+For information on how to use Monorail, see the [Monorail User Guide](doc/userguide/README.md).
+
+## Setting up a new instance of Monorail
+
+See: [Creating a new Monorail instance](doc/instance.md)
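As a companion to the `BindError: Unable to bind localhost:8080` entry above, here is a small sketch (not part of the new document) for checking the port before starting the dev server; the helper name is made up for illustration:

```python
# Check whether something is already listening on the devserver port.
import socket


def port_in_use(port, host='localhost'):
  """Return True if a TCP connection to host:port succeeds."""
  sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
  sock.settimeout(1)
  try:
    return sock.connect_ex((host, port)) == 0
  finally:
    sock.close()


if port_in_use(8080):
  print('Port 8080 is busy; find the process with `lsof -ti:8080`.')
```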
diff --git a/features/banspammer.py b/features/banspammer.py
index a6be311..fd28045 100644
--- a/features/banspammer.py
+++ b/features/banspammer.py
@@ -8,7 +8,6 @@
from __future__ import division
from __future__ import absolute_import
-import logging
import json
import time
@@ -17,10 +16,10 @@
from framework import framework_helpers
from framework import permissions
from framework import jsonfeed
-from framework import servlet
from framework import urls
-class BanSpammer(servlet.Servlet):
+
+class BanSpammer(flaskservlet.FlaskServlet):
"""Ban a user and mark their content as spam"""
def AssertBasePermission(self, mr):
@@ -57,12 +56,11 @@
mr, mr.viewed_user_auth.user_view.profile_url, include_project=False,
saved=1, ts=int(time.time()))
- # def PostBanSpammerPage(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostBanSpammerPage(self, **kwargs):
+ return self.handler(**kwargs)
-# when convert to flask switch jsonfeed.FlaskInternalTask
-class BanSpammerTask(jsonfeed.InternalTask):
+class BanSpammerTask(jsonfeed.FlaskInternalTask):
"""This task will update all of the comments and issues created by the
target user with is_spam=True, and also add a manual verdict attached
to the user who originated the ban request. This is a potentially long
@@ -96,18 +94,10 @@
self.services.issue, self.services.user, comment.id,
reporter_id, is_spammer)
- # remove the self.response.body when convert to flask
- self.response.body = json.dumps({
- 'comments': len(comments),
- 'issues': len(issues),
+ return json.dumps({
+ 'comments': len(comments),
+ 'issues': len(issues),
})
- # return json.dumps({
- # 'comments': len(comments),
- # 'issues': len(issues),
- # })
- # def GetBanSpammer(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostBanSpammer(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostBanSpammer(self, **kwargs):
+ return self.handler(**kwargs)
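This file and the handlers below (dateaction, filterrules, notify, pubsub) all uncomment the same Flask delegation shape: a `Get.../Post...` method that simply returns `self.handler(**kwargs)`, and a `HandleRequest` that returns its payload instead of writing to `self.response.body`. A minimal sketch of that shape, with `FakeFlaskInternalTask` standing in for `jsonfeed.FlaskInternalTask`, whose real implementation (auth, XSRF, response handling) is not shown in this diff:

```python
import json


class FakeFlaskInternalTask(object):
  # Stand-in only: the real framework.jsonfeed.FlaskInternalTask also wraps
  # request parsing, permission checks and error handling around the call.
  def handler(self, **kwargs):
    return self.HandleRequest(**kwargs)


class BanSpammerTaskSketch(FakeFlaskInternalTask):
  def HandleRequest(self, **_kwargs):
    # Mirrors the new return-a-JSON-payload style instead of setting
    # self.response.body as the webapp2-era code did.
    return json.dumps({'comments': 0, 'issues': 0})

  def PostBanSpammer(self, **kwargs):
    # The bound method is what a Flask URL rule would be pointed at.
    return self.handler(**kwargs)


print(BanSpammerTaskSketch().PostBanSpammer())
```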
diff --git a/features/dateaction.py b/features/dateaction.py
index 169f582..0cb3987 100644
--- a/features/dateaction.py
+++ b/features/dateaction.py
@@ -40,8 +40,7 @@
TEMPLATE_PATH = framework_constants.TEMPLATE_PATH
-# TODO: change to FlaskInternalTask when convert to Flask
-class DateActionCron(jsonfeed.InternalTask):
+class DateActionCron(jsonfeed.FlaskInternalTask):
"""Find and process issues with date-type values that arrived today."""
def HandleRequest(self, mr):
@@ -86,11 +85,8 @@
urls.ISSUE_DATE_ACTION_TASK + '.do', params)
cloud_tasks_helpers.create_task(task)
- # def GetDateActionCron(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostDateActionCron(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetDateActionCron(self, **kwargs):
+ return self.handler(**kwargs)
def _GetTimestampRange(now):
@@ -234,8 +230,5 @@
date_str = timestr.TimestampToDateWidgetStr(timestamp)
return 'The %s date has arrived: %s' % (field.field_name, date_str)
- # def GetIssueDateActionTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostIssueDateActionTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostIssueDateActionTask(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/features/filterrules.py b/features/filterrules.py
index 724d7e2..119c1b3 100644
--- a/features/filterrules.py
+++ b/features/filterrules.py
@@ -15,8 +15,7 @@
from tracker import tracker_constants
-# TODO: change to FlaskInternalTask when convert to flask
-class RecomputeDerivedFieldsTask(jsonfeed.InternalTask):
+class RecomputeDerivedFieldsTask(jsonfeed.FlaskInternalTask):
"""JSON servlet that recomputes derived fields on a batch of issues."""
def HandleRequest(self, mr):
@@ -36,15 +35,11 @@
'success': True,
}
- # def GetRecomputeDerivedFieldsTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostRecomputeDerivedFieldsTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostRecomputeDerivedFieldsTask(self, **kwargs):
+ return self.handler(**kwargs)
-# TODO: change to FlaskInternalTask when convert to Flask
-class ReindexQueueCron(jsonfeed.InternalTask):
+class ReindexQueueCron(jsonfeed.FlaskInternalTask):
"""JSON servlet that reindexes some issues each minute, as needed."""
def HandleRequest(self, mr):
@@ -57,8 +52,5 @@
'num_reindexed': num_reindexed,
}
- # def GetReindexQueueCron(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostReindexQueueCron(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetReindexQueueCron(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/features/hotlistcreate.py b/features/hotlistcreate.py
index fa8946f..6913c19 100644
--- a/features/hotlistcreate.py
+++ b/features/hotlistcreate.py
@@ -30,7 +30,7 @@
_MSG_INVALID_MEMBERS_INPUT = 'One or more editor emails is not valid.'
-class HotlistCreate(servlet.Servlet):
+class HotlistCreate(flaskservlet.FlaskServlet):
"""HotlistCreate shows a simple page with a form to create a hotlist."""
_PAGE_TEMPLATE = 'features/hotlist-create-page.ezt'
@@ -114,8 +114,8 @@
mr.cnxn, hotlist, self.services.user),
include_project=False)
- # def GetCreateHotlist(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetCreateHotlist(self, **kwargs):
+ return self.handler(**kwargs)
- # def PostCreateHotlist(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostCreateHotlist(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/features/inboundemail.py b/features/inboundemail.py
index d9c36d3..d9b2c37 100644
--- a/features/inboundemail.py
+++ b/features/inboundemail.py
@@ -13,13 +13,18 @@
import os
import re
import time
+import six
from six.moves import urllib
+import flask
import ezt
-from google.appengine.ext.webapp.mail_handlers import BounceNotificationHandler
+from google.appengine.api import mail
+if six.PY2:
+ from google.appengine.ext.webapp.mail_handlers import BounceNotification
+else:
+ from google.appengine.api.mail import BounceNotification
-import webapp2
import settings
from features import alert2issue
@@ -50,36 +55,37 @@
}
-class InboundEmail(webapp2.RequestHandler):
+class InboundEmail(object):
"""Servlet to handle inbound email messages."""
- def __init__(self, request, response, services=None, *args, **kwargs):
- super(InboundEmail, self).__init__(request, response, *args, **kwargs)
- self.services = services or self.app.config.get('services')
+ def __init__(self, services=None):
+ self.services = services or flask.current_app.config['services']
self._templates = {}
+ self.request = flask.request
for name, template_path in MSG_TEMPLATES.items():
self._templates[name] = template_helpers.MonorailTemplate(
TEMPLATE_PATH_BASE + template_path,
compress_whitespace=False, base_format=ezt.FORMAT_RAW)
- # def HandleInboundEmail(self, project_addr=None):
- # if self.request.method == 'POST':
- # self.post(project_addr)
- # elif self.request.method == 'GET':
- # self.get(project_addr)
+ def HandleInboundEmail(self, project_addr=None):
+ if self.request.method == 'POST':
+ self.post(project_addr)
+ elif self.request.method == 'GET':
+ self.get(project_addr)
+ return ''
def get(self, project_addr=None):
logging.info('\n\n\nGET for InboundEmail and project_addr is %r',
project_addr)
self.Handler(
- mail.InboundEmailMessage(self.request.body),
+ mail.InboundEmailMessage(self.request.get_data()),
urllib.parse.unquote(project_addr))
def post(self, project_addr=None):
logging.info('\n\n\nPOST for InboundEmail and project_addr is %r',
project_addr)
self.Handler(
- mail.InboundEmailMessage(self.request.body),
+ mail.InboundEmailMessage(self.request.get_data()),
urllib.parse.unquote(project_addr))
def Handler(self, inbound_email_message, project_addr):
@@ -293,29 +299,36 @@
BAD_EQ_RE = re.compile('=3D')
-class BouncedEmail(BounceNotificationHandler):
+class BouncedEmail(object):
"""Handler to notice when email to given user is bouncing."""
- # For docs on AppEngine's bounce email handling, see:
- # https://cloud.google.com/appengine/docs/python/mail/bounce
- # Source code is in file:
- # google_appengine/google/appengine/ext/webapp/mail_handlers.py
+ # For docs on AppEngine's bounce email see:
+ # https://cloud.google.com/appengine/docs/standard/python3/reference
+ # /services/bundled/google/appengine/api/mail/BounceNotification
- def post(self):
+ def __init__(self, services=None):
+ self.services = services or flask.current_app.config['services']
+
+ def postBouncedEmail(self):
try:
- super(BouncedEmail, self).post()
+ # Context: https://crbug.com/monorail/11083
+ bounce_message = BounceNotification(flask.request.form)
+ self.receive(bounce_message)
except AttributeError:
- # Work-around for
- # https://code.google.com/p/googleappengine/issues/detail?id=13512
- raw_message = self.request.POST.get('raw-message')
+ # Context: https://crbug.com/monorail/2105
+ raw_message = flask.request.form.get('raw-message')
logging.info('raw_message %r', raw_message)
raw_message = BAD_WRAP_RE.sub('', raw_message)
raw_message = BAD_EQ_RE.sub('=', raw_message)
logging.info('fixed raw_message %r', raw_message)
mime_message = email.message_from_string(raw_message)
logging.info('get_payload gives %r', mime_message.get_payload())
- self.request.POST['raw-message'] = mime_message
- super(BouncedEmail, self).post() # Retry with mime_message
+ new_form_dict = flask.request.form.copy()
+ new_form_dict['raw-message'] = mime_message
+ # Retry with mime_message
+ bounce_message = BounceNotification(new_form_dict)
+ self.receive(bounce_message)
+ return ''
def receive(self, bounce_message):
@@ -335,8 +348,7 @@
logging.info(
'bounce message original headers: %r', original_message.items())
- app_config = webapp2.WSGIApplication.app.config
- services = app_config['services']
+ services = self.services
cnxn = sql.MonorailConnection()
try:
@@ -346,6 +358,6 @@
services.user.UpdateUser(cnxn, user_id, user)
except exceptions.NoSuchUserException:
logging.info('User %r not found, ignoring', email_addr)
- logging.info('Received bounce post ... [%s]', self.request)
+ logging.info('Received bounce post ... [%s]', flask.request)
logging.info('Bounce original: %s', bounce_message.original)
logging.info('Bounce notification: %s', bounce_message.notification)
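The reworked `InboundEmail` and `BouncedEmail` classes above no longer subclass webapp2 handlers, so they need explicit Flask routing somewhere outside this diff. A sketch of how that wiring might look, assuming the standard App Engine `/_ah/mail/...` and `/_ah/bounce` endpoints and a `services` registry injected into the app config; none of the routes or names below are confirmed by this change:

```python
# Hypothetical Flask wiring for the handlers above; the real registration
# lives elsewhere in the app setup and may differ.
import flask

from features import inboundemail  # assumes the monorail package is on sys.path

app = flask.Flask(__name__)
app.config['services'] = None  # the real app injects its service registry here


@app.route('/_ah/mail/<string:project_addr>', methods=['GET', 'POST'])
def handle_inbound_mail(project_addr):
  # Constructed per request so flask.request and current_app are available.
  return inboundemail.InboundEmail().HandleInboundEmail(project_addr)


@app.route('/_ah/bounce', methods=['POST'])
def handle_bounce():
  return inboundemail.BouncedEmail().postBouncedEmail()
```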
diff --git a/features/notify.py b/features/notify.py
index 425041e..230cbf5 100644
--- a/features/notify.py
+++ b/features/notify.py
@@ -219,11 +219,8 @@
return email_tasks
- # def GetNotifyIssueChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostNotifyIssueChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostNotifyIssueChangeTask(self, **kwargs):
+ return self.handler(**kwargs)
class NotifyBlockingChangeTask(notify_helpers.NotifyTaskBase):
@@ -356,11 +353,8 @@
return one_issue_email_tasks
- # def GetNotifyBlockingChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostNotifyBlockingChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostNotifyBlockingChangeTask(self, **kwargs):
+ return self.handler(**kwargs)
class NotifyBulkChangeTask(notify_helpers.NotifyTaskBase):
@@ -724,11 +718,8 @@
return subject, body
- # def GetNotifyBulkChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostNotifyBulkChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostNotifyBulkChangeTask(self, **kwargs):
+ return self.handler(**kwargs)
# For now, this class will not be used to send approval comment notifications
@@ -919,11 +910,8 @@
return list(set(recipient_ids))
- # def GetNotifyApprovalChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostNotifyApprovalChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostNotifyApprovalChangeTask(self, **kwargs):
+ return self.handler(**kwargs)
class NotifyRulesDeletedTask(notify_helpers.NotifyTaskBase):
@@ -991,15 +979,11 @@
return email_tasks
- # def GetNotifyRulesDeletedTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostNotifyRulesDeletedTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostNotifyRulesDeletedTask(self, **kwargs):
+ return self.handler(**kwargs)
-# TODO: change to FlaskInternalTask when convert to flask
-class OutboundEmailTask(jsonfeed.InternalTask):
+class OutboundEmailTask(jsonfeed.FlaskInternalTask):
"""JSON servlet that sends one email.
Handles tasks enqueued from notify_helpers._EnqueueOutboundEmail.
@@ -1018,9 +1002,9 @@
# To avoid urlencoding the email body, the most salient parameters to this
# method are passed as a json-encoded POST body.
try:
- email_params = json.loads(self.request.body)
+ email_params = json.loads(self.request.get_data(as_text=True))
except ValueError:
- logging.error(self.request.body)
+ logging.error(self.request.get_data(as_text=True))
raise
# If running on a GAFYD domain, you must define an app alias on the
# Application Settings admin web page.
@@ -1085,8 +1069,5 @@
sender=sender, to=to, subject=subject, body=body, html_body=html_body,
reply_to=reply_to, references=references)
- # def GetOutboundEmailTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostOutboundEmailTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostOutboundEmailTask(self, **kwargs):
+ return self.handler(**kwargs)
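Note: OutboundEmailTask reads its parameters from a JSON-encoded POST body rather than urlencoded form fields, which is why the Flask port switches to request.get_data(as_text=True). A rough sketch of that decode step under an assumed stand-in view (names are illustrative):

    import json
    import flask

    app = flask.Flask(__name__)

    @app.route('/_task/outboundEmail.do', methods=['POST'])
    def outbound_email_task():
        # The payload is the JSON string itself, not form-encoded data,
        # so the email body never needs to be urlencoded.
        params = json.loads(flask.request.get_data(as_text=True))
        return json.dumps({'note': 'stub', 'subject': params.get('subject')})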
diff --git a/features/notify_helpers.py b/features/notify_helpers.py
index 5f77307..f22ed0e 100644
--- a/features/notify_helpers.py
+++ b/features/notify_helpers.py
@@ -123,8 +123,7 @@
return notified
-# TODO: change to FlaskInternalTask when convert to flask
-class NotifyTaskBase(jsonfeed.InternalTask):
+class NotifyTaskBase(jsonfeed.FlaskInternalTask):
"""Abstract base class for notification task handler."""
_EMAIL_TEMPLATE = None # Subclasses must override this.
diff --git a/features/pubsub.py b/features/pubsub.py
index 86bd3ba..c7c28d9 100644
--- a/features/pubsub.py
+++ b/features/pubsub.py
@@ -26,8 +26,7 @@
from framework import jsonfeed
-# TODO: change to FlaskInternalTask when convert to flask
-class PublishPubsubIssueChangeTask(jsonfeed.InternalTask):
+class PublishPubsubIssueChangeTask(jsonfeed.FlaskInternalTask):
"""JSON servlet that pushes issue update messages onto a pub/sub topic."""
def HandleRequest(self, mr):
@@ -71,11 +70,8 @@
return {}
- # def GetPublishPubsubIssueChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostPublishPubsubIssueChangeTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostPublishPubsubIssueChangeTask(self, **kwargs):
+ return self.handler(**kwargs)
def set_up_pubsub_api():
diff --git a/features/test/activities_test.py b/features/test/activities_test.py
index 4eae1ab..c17eb4b 100644
--- a/features/test/activities_test.py
+++ b/features/test/activities_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from features import activities
from framework import framework_views
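Note: the try/except import used across these tests keeps a single module name whether the environment provides the legacy mox package or the Python 3 mox3 wheel, whose mox3.mox module preserves the same record/replay API. A small sketch of the idiom with the shared subset of that API:

    try:
        from mox3 import mox  # Python 3 distribution
    except ImportError:
        import mox            # legacy Python 2 package

    m = mox.Mox()
    stub = m.CreateMockAnything()
    stub.Ping().AndReturn('pong')   # record
    m.ReplayAll()
    assert stub.Ping() == 'pong'    # replay
    m.VerifyAll()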
diff --git a/features/test/alert2issue_test.py b/features/test/alert2issue_test.py
index 3b1b6d1..2046b5b 100644
--- a/features/test/alert2issue_test.py
+++ b/features/test/alert2issue_test.py
@@ -11,7 +11,10 @@
import email
import unittest
from mock import patch
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from parameterized import parameterized
from features import alert2issue
diff --git a/features/test/banspammer_test.py b/features/test/banspammer_test.py
index edf7aba..e12c506 100644
--- a/features/test/banspammer_test.py
+++ b/features/test/banspammer_test.py
@@ -10,10 +10,8 @@
import json
import mock
-import os
import unittest
from six.moves import urllib
-import webapp2
import settings
from features import banspammer
@@ -35,7 +33,7 @@
project=fake.ProjectService(),
spam=fake.SpamService(),
user=fake.UserService())
- self.servlet = banspammer.BanSpammer('req', 'res', services=self.services)
+ self.servlet = banspammer.BanSpammer(services=self.services)
@mock.patch('framework.cloud_tasks_helpers._get_client')
def testProcessFormData_noPermission(self, get_client_mock):
@@ -92,17 +90,15 @@
self.services = service_manager.Services(
issue=fake.IssueService(),
spam=fake.SpamService())
- self.res = webapp2.Response()
- self.servlet = banspammer.BanSpammerTask('req', self.res,
- services=self.services)
+ self.servlet = banspammer.BanSpammerTask(services=self.services)
def testProcessFormData_okNoIssues(self):
mr = testing_helpers.MakeMonorailRequest(
path=urls.BAN_SPAMMER_TASK + '.do', method='POST',
params={'spammer_id': 111, 'reporter_id': 222})
- self.servlet.HandleRequest(mr)
- self.assertEqual(self.res.body, json.dumps({'comments': 0, 'issues': 0}))
+ res = self.servlet.HandleRequest(mr)
+ self.assertEqual(res, json.dumps({'comments': 0, 'issues': 0}))
def testProcessFormData_okSomeIssues(self):
mr = testing_helpers.MakeMonorailRequest(
@@ -114,8 +110,8 @@
1, i, 'issue_summary', 'New', 111, project_name='project-name')
self.servlet.services.issue.TestAddIssue(issue)
- self.servlet.HandleRequest(mr)
- self.assertEqual(self.res.body, json.dumps({'comments': 0, 'issues': 10}))
+ res = self.servlet.HandleRequest(mr)
+ self.assertEqual(res, json.dumps({'comments': 0, 'issues': 10}))
def testProcessFormData_okSomeCommentsAndIssues(self):
mr = testing_helpers.MakeMonorailRequest(
@@ -137,5 +133,5 @@
comment.user_id = 111
comment.issue_id = issue.issue_id
self.servlet.services.issue.TestAddComment(comment, issue.local_id)
- self.servlet.HandleRequest(mr)
- self.assertEqual(self.res.body, json.dumps({'comments': 50, 'issues': 10}))
+ res = self.servlet.HandleRequest(mr)
+ self.assertEqual(res, json.dumps({'comments': 50, 'issues': 10}))
diff --git a/features/test/dateaction_test.py b/features/test/dateaction_test.py
index 09e5c5c..8ca5bc3 100644
--- a/features/test/dateaction_test.py
+++ b/features/test/dateaction_test.py
@@ -36,8 +36,7 @@
self.services = service_manager.Services(
user=fake.UserService(),
issue=fake.IssueService())
- self.servlet = dateaction.DateActionCron(
- 'req', 'res', services=self.services)
+ self.servlet = dateaction.DateActionCron(services=self.services)
self.TIMESTAMP_MIN = (
NOW // framework_constants.SECS_PER_DAY *
framework_constants.SECS_PER_DAY)
@@ -128,8 +127,7 @@
project=fake.ProjectService(),
config=fake.ConfigService(),
issue_star=fake.IssueStarService())
- self.servlet = dateaction.IssueDateActionTask(
- 'req', 'res', services=self.services)
+ self.servlet = dateaction.IssueDateActionTask(services=self.services)
self.config = self.services.config.GetProjectConfig('cnxn', 789)
self.config.field_defs = [
diff --git a/features/test/hotlistcreate_test.py b/features/test/hotlistcreate_test.py
index 8cf0012..e6cda4b 100644
--- a/features/test/hotlistcreate_test.py
+++ b/features/test/hotlistcreate_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
import settings
@@ -30,8 +33,7 @@
user=fake.UserService(),
issue=fake.IssueService(),
features=fake.FeaturesService())
- self.servlet = hotlistcreate.HotlistCreate('req', 'res',
- services=self.services)
+ self.servlet = hotlistcreate.HotlistCreate(services=self.services)
self.mox = mox.Mox()
def tearDown(self):
diff --git a/features/test/hotlistdetails_test.py b/features/test/hotlistdetails_test.py
index 9a9e53f..561199c 100644
--- a/features/test/hotlistdetails_test.py
+++ b/features/test/hotlistdetails_test.py
@@ -9,7 +9,10 @@
from __future__ import absolute_import
import logging
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
import mock
diff --git a/features/test/hotlistissues_test.py b/features/test/hotlistissues_test.py
index 49c3270..265c9d1 100644
--- a/features/test/hotlistissues_test.py
+++ b/features/test/hotlistissues_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import mock
import unittest
import time
diff --git a/features/test/hotlistpeople_test.py b/features/test/hotlistpeople_test.py
index 74beec3..3ee7925 100644
--- a/features/test/hotlistpeople_test.py
+++ b/features/test/hotlistpeople_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
import logging
diff --git a/features/test/inboundemail_test.py b/features/test/inboundemail_test.py
index 0eaa281..de05749 100644
--- a/features/test/inboundemail_test.py
+++ b/features/test/inboundemail_test.py
@@ -9,14 +9,14 @@
from __future__ import absolute_import
import unittest
-import webapp2
-from mock import patch
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import time
from google.appengine.api import mail
-from google.appengine.ext.webapp.mail_handlers import BounceNotificationHandler
import settings
from businesslogic import work_env
@@ -58,8 +58,7 @@
self.msg = testing_helpers.MakeMessage(
testing_helpers.HEADER_LINES, 'awesome!')
- request, _ = testing_helpers.GetRequestObjects()
- self.inbound = inboundemail.InboundEmail(request, None, self.services)
+ self.inbound = inboundemail.InboundEmail(self.services)
self.mox = mox.Mox()
def tearDown(self):
@@ -348,10 +347,7 @@
user=fake.UserService())
self.user = self.services.user.TestAddUser('user@example.com', 111)
- app = webapp2.WSGIApplication(config={'services': self.services})
- app.set_globals(app=app)
-
- self.servlet = inboundemail.BouncedEmail()
+ self.servlet = inboundemail.BouncedEmail(self.services)
self.mox = mox.Mox()
def tearDown(self):
@@ -369,8 +365,6 @@
def testReceive_NoSuchUser(self):
"""When not found, log it and ignore without creating a user record."""
- self.servlet.request = webapp2.Request.blank(
- '/', POST={'raw-message': 'this is an email message'})
bounce_message = testing_helpers.Blank(
original={'to': 'nope@example.com'},
notification='notification')
diff --git a/features/test/notify_test.py b/features/test/notify_test.py
index 9ddcce7..e73488d 100644
--- a/features/test/notify_test.py
+++ b/features/test/notify_test.py
@@ -11,7 +11,7 @@
import json
import mock
import unittest
-import webapp2
+import flask
from google.appengine.ext import testbed
@@ -63,7 +63,13 @@
self.orig_sign_attachment_id = attachment_helpers.SignAttachmentID
attachment_helpers.SignAttachmentID = (
lambda aid: 'signed_%d' % aid)
-
+ self.servlet = notify.OutboundEmailTask(services=self.services)
+ self.app = flask.Flask('test_app')
+ self.app.config['TESTING'] = True
+ self.app.add_url_rule(
+ '/_task/outboundEmail.do',
+ view_func=self.servlet.PostOutboundEmailTask,
+ methods=['POST'])
self.testbed = testbed.Testbed()
self.testbed.activate()
self.testbed.init_memcache_stub()
@@ -89,8 +95,7 @@
result['params']['issue_ids'])
def testNotifyIssueChangeTask_Normal(self):
- task = notify.NotifyIssueChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyIssueChangeTask(services=self.services)
params = {'send_email': 1, 'issue_id': 12345001, 'seq': 0,
'commenter_id': 2}
mr = testing_helpers.MakeMonorailRequest(
@@ -107,8 +112,7 @@
project_id=12345, local_id=1, owner_id=1, reporter_id=1,
is_spam=True)
self.services.issue.TestAddIssue(issue)
- task = notify.NotifyIssueChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyIssueChangeTask(services=self.services)
params = {'send_email': 0, 'issue_id': issue.issue_id, 'seq': 0,
'commenter_id': 2}
mr = testing_helpers.MakeMonorailRequest(
@@ -124,8 +128,7 @@
issue2 = MakeTestIssue(
project_id=12345, local_id=2, owner_id=2, reporter_id=1)
self.services.issue.TestAddIssue(issue2)
- task = notify.NotifyBlockingChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBlockingChangeTask(services=self.services)
params = {
'send_email': 1, 'issue_id': issue2.issue_id, 'seq': 0,
'delta_blocker_iids': self.issue1.issue_id, 'commenter_id': 1,
@@ -143,8 +146,7 @@
project_id=12345, local_id=2, owner_id=2, reporter_id=1,
is_spam=True)
self.services.issue.TestAddIssue(issue2)
- task = notify.NotifyBlockingChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBlockingChangeTask(services=self.services)
params = {
'send_email': 1, 'issue_id': issue2.issue_id, 'seq': 0,
'delta_blocker_iids': self.issue1.issue_id, 'commenter_id': 1}
@@ -163,8 +165,7 @@
project_id=12345, local_id=2, owner_id=2, reporter_id=1)
issue2.cc_ids = [3]
self.services.issue.TestAddIssue(issue2)
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
params = {
'send_email': 1, 'seq': 0,
'issue_ids': '%d,%d' % (self.issue1.issue_id, issue2.issue_id),
@@ -195,8 +196,7 @@
"""We generate email tasks for also-notify addresses."""
self.issue1.derived_notify_addrs = [
'mailing-list@example.com', 'member@example.com']
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
params = {
'send_email': 1, 'seq': 0,
'issue_ids': '%d' % (self.issue1.issue_id),
@@ -230,8 +230,7 @@
def testNotifyBulkChangeTask_ProjectNotify(self, create_task_mock):
"""We generate email tasks for project.issue_notify_address."""
self.project.issue_notify_address = 'mailing-list@example.com'
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
params = {
'send_email': 1, 'seq': 0,
'issue_ids': '%d' % (self.issue1.issue_id),
@@ -265,8 +264,7 @@
@mock.patch('framework.cloud_tasks_helpers.create_task')
def testNotifyBulkChangeTask_SubscriberGetsEmail(self, create_task_mock):
"""If a user subscription matches the issue, notify that user."""
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
params = {
'send_email': 1,
'issue_ids': '%d' % (self.issue1.issue_id),
@@ -293,8 +291,7 @@
def testNotifyBulkChangeTask_CCAndSubscriberListsIssueOnce(
self, create_task_mock):
"""If a user both CCs and subscribes, include issue only once."""
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
params = {
'send_email': 1,
'issue_ids': '%d' % (self.issue1.issue_id),
@@ -335,8 +332,7 @@
project_id=12345, local_id=2, owner_id=2, reporter_id=1,
is_spam=True)
self.services.issue.TestAddIssue(issue2)
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
params = {
'send_email': 1,
'issue_ids': '%d,%d' % (self.issue1.issue_id, issue2.issue_id),
@@ -353,8 +349,7 @@
def testFormatBulkIssues_Normal_Single(self):
"""A user may see full notification details for all changed issues."""
self.issue1.summary = 'one summary'
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
users_by_id = {}
commenter_view = None
config = self.services.config.GetProjectConfig('cnxn', 12345)
@@ -374,8 +369,7 @@
"""A user may see full notification details for all changed issues."""
self.issue1.summary = 'one summary'
self.issue2.summary = 'two summary'
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
users_by_id = {}
commenter_view = None
config = self.services.config.GetProjectConfig('cnxn', 12345)
@@ -396,8 +390,7 @@
"""A user may not see full notification details for some changed issue."""
self.issue1.summary = 'one summary'
self.issue1.labels = ['Restrict-View-Google']
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
users_by_id = {}
commenter_view = None
config = self.services.config.GetProjectConfig('cnxn', 12345)
@@ -419,8 +412,7 @@
self.issue1.summary = 'one summary'
self.issue1.labels = ['Restrict-View-Google']
self.issue2.summary = 'two summary'
- task = notify.NotifyBulkChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyBulkChangeTask(services=self.services)
users_by_id = {}
commenter_view = None
config = self.services.config.GetProjectConfig('cnxn', 12345)
@@ -522,8 +514,7 @@
self.services.issue.TestAddAttachment(
attach, comment.id, approval_issue.issue_id)
- task = notify.NotifyApprovalChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyApprovalChangeTask(services=self.services)
params = {
'send_email': 1,
'issue_id': approval_issue.issue_id,
@@ -556,8 +547,7 @@
project_id=12345, user_id=999, issue_id=approval_issue.issue_id,
amendments=[amend2], timestamp=1234567891, content='')
self.services.issue.TestAddComment(comment2, approval_issue.local_id)
- task = notify.NotifyApprovalChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyApprovalChangeTask(services=self.services)
params = {
'send_email': 1,
'issue_id': approval_issue.issue_id,
@@ -579,8 +569,7 @@
result['notified'])
def testNotifyApprovalChangeTask_GetApprovalEmailRecipients(self):
- task = notify.NotifyApprovalChangeTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyApprovalChangeTask(services=self.services)
issue = fake.MakeTestIssue(789, 1, 'summary', 'New', 111)
approval_value = tracker_pb2.ApprovalValue(
approver_ids=[222, 333],
@@ -625,8 +614,7 @@
'proj', owner_ids=[777, 888], project_id=789)
self.services.user.TestAddUser('owner1@test.com', 777)
self.services.user.TestAddUser('cow@test.com', 888)
- task = notify.NotifyRulesDeletedTask(
- request=None, response=None, services=self.services)
+ task = notify.NotifyRulesDeletedTask(services=self.services)
params = {'project_id': 789,
'filter_rules': 'if green make yellow,if orange make blue'}
mr = testing_helpers.MakeMonorailRequest(
@@ -649,18 +637,12 @@
'reply_to': 'user@example.com',
'to': 'user@example.com',
'subject': 'Test subject'}
- body = json.dumps(params)
- request = webapp2.Request.blank('/', body=body)
- task = notify.OutboundEmailTask(
- request=request, response=None, services=self.services)
- mr = testing_helpers.MakeMonorailRequest(
- user_info={'user_id': 1},
- payload=body,
- method='POST',
- services=self.services)
- result = task.HandleRequest(mr)
- self.assertEqual(params['from_addr'], result['sender'])
- self.assertEqual(params['subject'], result['subject'])
+ data = json.dumps(params)
+ res = self.app.test_client().post('/_task/outboundEmail.do', data=data)
+ res_string = res.get_data()[5:]
+ res_json = json.loads(res_string)
+ self.assertEqual(params['from_addr'], res_json['sender'])
+ self.assertEqual(params['subject'], res_json['subject'])
def testOutboundEmailTask_MissingTo(self):
"""We skip emails that don't specify the To-line."""
@@ -668,36 +650,26 @@
'from_addr': 'requester@example.com',
'reply_to': 'user@example.com',
'subject': 'Test subject'}
- body = json.dumps(params)
- request = webapp2.Request.blank('/', body=body)
- task = notify.OutboundEmailTask(
- request=request, response=None, services=self.services)
- mr = testing_helpers.MakeMonorailRequest(
- user_info={'user_id': 1},
- payload=body,
- method='POST',
- services=self.services)
- result = task.HandleRequest(mr)
- self.assertEqual('Skipping because no "to" address found.', result['note'])
- self.assertNotIn('from_addr', result)
+ data = json.dumps(params)
+ res = self.app.test_client().post('/_task/outboundEmail.do', data=data)
+ res_string = res.get_data()[5:]
+ res_json = json.loads(res_string)
+ self.assertEqual(
+ 'Skipping because no "to" address found.', res_json['note'])
+ self.assertNotIn('from_addr', res_string)
def testOutboundEmailTask_BannedUser(self):
"""We don't send emails to banned users.."""
+ self.servlet.services.user.TestAddUser(
+ 'banned@example.com', 404, banned=True)
params = {
'from_addr': 'requester@example.com',
'reply_to': 'user@example.com',
'to': 'banned@example.com',
'subject': 'Test subject'}
- body = json.dumps(params)
- request = webapp2.Request.blank('/', body=body)
- task = notify.OutboundEmailTask(
- request=request, response=None, services=self.services)
- mr = testing_helpers.MakeMonorailRequest(
- user_info={'user_id': 1},
- payload=body,
- method='POST',
- services=self.services)
- self.services.user.TestAddUser('banned@example.com', 404, banned=True)
- result = task.HandleRequest(mr)
- self.assertEqual('Skipping because user is banned.', result['note'])
- self.assertNotIn('from_addr', result)
+ data = json.dumps(params)
+ res = self.app.test_client().post('/_task/outboundEmail.do', data=data)
+ res_string = res.get_data()[5:]
+ res_json = json.loads(res_string)
+ self.assertEqual('Skipping because user is banned.', res_json['note'])
+ self.assertNotIn('from_addr', res_string)
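Note: the rewritten tests post through Flask's test client and slice off the first five bytes of the response before json.loads; that slice drops the anti-XSSI prefix that Monorail's JSON feeds emit ahead of the payload (assumed here to be the four characters )]}' plus a newline, five bytes total). A hedged sketch of the pattern against a stand-in route:

    import json
    import flask

    app = flask.Flask('test_app')
    app.config['TESTING'] = True

    @app.route('/_task/example.do', methods=['POST'])
    def example_task():
        # JSON feeds prepend an anti-XSSI prefix before the JSON body.
        return ")]}'\n" + json.dumps({'sender': 'requester@example.com'})

    res = app.test_client().post('/_task/example.do', data=json.dumps({}))
    res_json = json.loads(res.get_data()[5:])  # drop the 5-byte prefix
    assert res_json['sender'] == 'requester@example.com'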
diff --git a/features/test/pubsub_test.py b/features/test/pubsub_test.py
index 2044cf7..e86230c 100644
--- a/features/test/pubsub_test.py
+++ b/features/test/pubsub_test.py
@@ -38,8 +38,7 @@
def testPublishPubsubIssueChangeTask_NoIssueIdParam(self):
"""Test case when issue_id param is not passed."""
- task = pubsub.PublishPubsubIssueChangeTask(
- request=None, response=None, services=self.services)
+ task = pubsub.PublishPubsubIssueChangeTask(services=self.services)
mr = testing_helpers.MakeMonorailRequest(
user_info={'user_id': 1},
params={},
@@ -54,8 +53,7 @@
def testPublishPubsubIssueChangeTask_PubSubAPIInitFailure(self):
"""Test case when pub/sub API fails to init."""
pubsub.set_up_pubsub_api = Mock(return_value=None)
- task = pubsub.PublishPubsubIssueChangeTask(
- request=None, response=None, services=self.services)
+ task = pubsub.PublishPubsubIssueChangeTask(services=self.services)
mr = testing_helpers.MakeMonorailRequest(
user_info={'user_id': 1},
params={},
@@ -69,8 +67,7 @@
def testPublishPubsubIssueChangeTask_IssueNotFound(self):
"""Test case when issue is not found."""
- task = pubsub.PublishPubsubIssueChangeTask(
- request=None, response=None, services=self.services)
+ task = pubsub.PublishPubsubIssueChangeTask(services=self.services)
mr = testing_helpers.MakeMonorailRequest(
user_info={'user_id': 1},
params={'issue_id': 314159},
@@ -87,8 +84,7 @@
issue = fake.MakeTestIssue(789, 543, 'sum', 'New', 111, issue_id=78901,
project_name='rutabaga')
self.services.issue.TestAddIssue(issue)
- task = pubsub.PublishPubsubIssueChangeTask(
- request=None, response=None, services=self.services)
+ task = pubsub.PublishPubsubIssueChangeTask(services=self.services)
mr = testing_helpers.MakeMonorailRequest(
user_info={'user_id': 1},
params={'issue_id': 78901},
diff --git a/features/test/savedqueries_helpers_test.py b/features/test/savedqueries_helpers_test.py
index d635fe1..7f5ad47 100644
--- a/features/test/savedqueries_helpers_test.py
+++ b/features/test/savedqueries_helpers_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from features import savedqueries_helpers
from testing import fake
diff --git a/flaskregisterpages.py b/flaskregisterpages.py
index af73a7f..4b01785 100644
--- a/flaskregisterpages.py
+++ b/flaskregisterpages.py
@@ -16,6 +16,7 @@
from features import userhotlists
from framework import banned
from framework import clientmon
+from framework import csp_report
from framework import warmup
from framework import reap
from framework import deleteusers
@@ -139,22 +140,22 @@
def RegisterGroupUrls(self, services):
flaskapp_group = flask.Flask(__name__)
_GROUP_URL = [
- # ('/', grouplist.GroupList(services=services).GetGroupList, ['GET']),
- # (
- # '/<string:viewed_username>/',
- # groupdetail.GroupDetail(services=services).GetGroupDetail,
- # ['GET']),
- # (
- # '/<string:viewed_username>/edit.do',
- # groupdetail.GroupDetail(services=services).PostGroupDetail,
- # ['POST']),
- # (
- # '/<string:viewed_username>/groupadmin',
- # groupadmin.GroupAdmin(services=services).GetGroupAdmin, ['GET']),
- # (
- # '/<string:viewed_username>/groupadmin.do',
- # groupadmin.GroupAdmin(services=services).PostGroupAdmin,
- # ['POST']),
+ (
+ '/', grouplist.FlaskGroupList(services=services).GetGroupList,
+ ['GET']),
+ (
+ '/<string:viewed_username>/',
+ groupdetail.GroupDetail(services=services).GetGroupDetail, ['GET']),
+ (
+ '/<string:viewed_username>/edit.do',
+ groupdetail.GroupDetail(services=services).PostGroupDetail,
+ ['POST']),
+ (
+ '/<string:viewed_username>/groupadmin',
+ groupadmin.GroupAdmin(services=services).GetGroupAdmin, ['GET']),
+ (
+ '/<string:viewed_username>/groupadmin.do',
+ groupadmin.GroupAdmin(services=services).PostGroupAdmin, ['POST']),
]
return self._AddFlaskUrlRules(flaskapp_group, _GROUP_URL)
@@ -163,55 +164,52 @@
def RegisterHostingUrl(self, service):
flaskapp_hosting = flask.Flask(__name__)
_HOSTING_URL = [
- # (
- # '/excessiveActivity',
- # excessiveactivity.ExcessiveActivity(
- # services=service).GetExcessiveActivity, ['GET']),
- # (
- # '/settings',
- # usersettings.UserSettings(services=service).GetUserSetting, ['GET'
- # ]),
- # (
- # '/settings.do',
- # usersettings.UserSettings(services=service).PostUserSetting,
- # ['POST']),
- # ('/noAccess', banned.Banned(services=service).GetNoAccessPage,
- # ['GET']),
- # (
- # '/moved', moved.ProjectMoved(services=service).GetProjectMoved,
- # ['GET']),
- # (
- # '/createProject',
- # projectcreate.ProjectCreate(services=service).GetCreateProject,
- # ['GET']),
- # (
- # '/createProject.do',
- # projectcreate.ProjectCreate(services=service).PostCreateProject,
- # ['POST']),
- # (
- # '/createHotlist',
- # hotlistcreate.HotlistCreate(services=service).GetCreateHotlist,
- # ['GET']),
- # (
- # '/createHotlist.do',
- # hotlistcreate.HotlistCreate(services=service).PostCreateHotlist,
- # ['POST']),
- # (
- # '/createGroup',
- # groupcreate.GroupCreate(services=service).GetGroupCreate,
- # ['GET']),
- # (
- # '/createGroup.do',
- # groupcreate.GroupCreate(services=service).PostGroupCreate,
- # ['POST']),
- # (
- # '/deleteGroup',
- # grouplist.GroupDelete(services=service).GetGroupDelete,
- # ['GET']),
- # (
- # '/deleteGroup.do',
- # grouplist.GroupDelete(services=service).PostGroupDelete,
- # ['POST']),
+ (
+ '/excessiveActivity',
+ excessiveactivity.ExcessiveActivity(
+ services=service).GetExcessiveActivity, ['GET']),
+ (
+ '/settings',
+ usersettings.UserSettings(services=service).GetUserSetting, ['GET'
+ ]),
+ (
+ '/settings.do',
+ usersettings.UserSettings(services=service).PostUserSetting,
+ ['POST']),
+ ('/noAccess', banned.Banned(services=service).GetNoAccessPage, ['GET']),
+ (
+ '/moved', moved.ProjectMoved(services=service).GetProjectMoved,
+ ['GET']),
+ (
+ '/createProject',
+ projectcreate.ProjectCreate(services=service).GetCreateProject,
+ ['GET']),
+ (
+ '/createProject.do',
+ projectcreate.ProjectCreate(services=service).PostCreateProject,
+ ['POST']),
+ (
+ '/createHotlist',
+ hotlistcreate.HotlistCreate(services=service).GetCreateHotlist,
+ ['GET']),
+ (
+ '/createHotlist.do',
+ hotlistcreate.HotlistCreate(services=service).PostCreateHotlist,
+ ['POST']),
+ (
+ '/createGroup',
+ groupcreate.GroupCreate(services=service).GetGroupCreate, ['GET']),
+ (
+ '/createGroup.do',
+ groupcreate.GroupCreate(services=service).PostGroupCreate, ['POST'
+ ]),
+ (
+ '/deleteGroup',
+ grouplist.FlaskGroupList(services=service).GetGroupDelete, ['GET']),
+ (
+ '/deleteGroup.do',
+ grouplist.FlaskGroupList(services=service).PostGroupDelete,
+ ['POST']),
]
flaskapp_hosting = self._AddFlaskUrlRules(flaskapp_hosting, _HOSTING_URL)
@@ -246,6 +244,13 @@
return flaskapp_project_redirect
+ def RegisterCspUrl(self):
+ flaskapp_csp = flask.Flask(__name__)
+ flaskapp_csp.add_url_rule(
+ '/', view_func=csp_report.postCsp, methods=['POST'])
+
+ return flaskapp_csp
+
def RegisterProjectUrls(self, service):
flaskapp_project = flask.Flask(__name__)
_PROJECT_URLS = [
@@ -668,114 +673,61 @@
def RegisterTaskUrl(self, service):
flaskapp_task = flask.Flask(__name__)
_TASK_URL = [
- # (
- # '/banSpammer',
- # banspammer.BanSpammerTask(services=service).GetBanSpammer,
- # ['GET']),
- # (
- # '/banSpammer.do',
- # banspammer.BanSpammerTask(services=service).PostBanSpammer,
- # ['POST']),
- # (
- # '/sendWipeoutUserListsTask',
- # deleteusers.SendWipeoutUserListsTask(
- # services=service).GetSendWipeoutUserListsTask, ['GET']),
- # (
- # '/sendWipeoutUserListsTask.do',
- # deleteusers.SendWipeoutUserListsTask(
- # services=service).PostSendWipeoutUserListsTask, ['POST']),
- # (
- # '/deleteWipeoutUsersTask',
- # deleteusers.DeleteWipeoutUsersTask(
- # services=service).GetDeleteWipeoutUsersTask, ['GET']),
- # (
- # '/deleteWipeoutUsersTask.do',
- # deleteusers.DeleteWipeoutUsersTask(
- # services=service).PostDeleteWipeoutUsersTask, ['POST']),
- # (
- # '/deleteUsersTask',
- # deleteusers.DeleteUsersTask(services=service).GetDeleteUsersTask,
- # ['GET']),
- # (
- # '/deleteUsersTask.do',
- # deleteusers.DeleteUsersTask(services=service).PostDeleteUsersTask,
- # ['POST']),
- # (
- # '/notifyRulesDeleted',
- # notify.NotifyRulesDeletedTask(
- # services=service).GetNotifyRulesDeletedTask, ['GET']),
- # (
- # '/notifyRulesDeleted.do',
- # notify.NotifyRulesDeletedTask(
- # services=service).PostNotifyRulesDeletedTask, ['POST']),
- # (
- # '/notifyIssueChange', notify.NotifyIssueChangeTask(
- # services=service).GetNotifyIssueChangeTask, ['GET']),
- # (
- # '/notifyIssueChange.do',
- # notify.NotifyIssueChangeTask(
- # services=service).PostNotifyIssueChangeTask, ['POST']),
- # (
- # '/notifyBlockingChange',
- # notify.NotifyBlockingChangeTask(
- # services=service).GetNotifyBlockingChangeTask, ['GET']),
- # (
- # '/notifyBlockingChange.do',
- # notify.NotifyBlockingChangeTask(
- # services=service).PostNotifyBlockingChangeTask, ['POST']),
- # (
- # '/notifyBulkEdit', notify.NotifyBulkChangeTask(
- # services=service).GetNotifyBulkChangeTask, ['GET']),
- # (
- # '/notifyBulkEdit.do', notify.NotifyBulkChangeTask(
- # services=service).PostNotifyBulkChangeTask, ['POST']),
- # (
- # '/notifyApprovalChange',
- # notify.NotifyApprovalChangeTask(
- # services=service).GetNotifyApprovalChangeTask, ['GET']),
- # (
- # '/notifyApprovalChange.do',
- # notify.NotifyApprovalChangeTask(
- # services=service).PostNotifyApprovalChangeTask, ['POST']),
- # (
- # '/publishPubsubIssueChange',
- # pubsub.PublishPubsubIssueChangeTask(
- # services=service).GetPublishPubsubIssueChangeTask, ['GET']),
- # (
- # '/publishPubsubIssueChange.do',
- # pubsub.PublishPubsubIssueChangeTask(
- # services=service).PostPublishPubsubIssueChangeTask, ['POST']),
- # (
- # '/issueDateAction', dateaction.IssueDateActionTask(
- # services=service).GetIssueDateActionTask, ['GET']),
- # (
- # '/issueDateAction.do',
- # dateaction.IssueDateActionTask(
- # services=service).PostIssueDateActionTask, ['POST']),
- # (
- # '/fltConversionTask',
- # fltconversion.FLTConvertTask(services=service).GetFLTConvertTask,
- # ['GET']),
- # (
- # '/fltConversionTask.do',
- # fltconversion.FLTConvertTask(services=service).PostFLTConvertTask,
- # ['POST']),
- # (
- # '/outboundEmail',
- # notify.OutboundEmailTask(services=service).GetOutboundEmailTask,
- # ['GET']),
- # (
- # '/outboundEmail.do',
- # notify.OutboundEmailTask(services=service).PostOutboundEmailTask,
- # ['POST']),
- # (
- # '/recomputeDerivedFields',
- # filterrules.RecomputeDerivedFieldsTask(
- # services=service).GetRecomputeDerivedFieldsTask, ['GET']),
- # (
- # '/recomputeDerivedFields.do',
- # filterrules.RecomputeDerivedFieldsTask(
- # services=service).PostRecomputeDerivedFieldsTask, ['POST']),
+ (
+ '/banSpammer.do',
+ banspammer.BanSpammerTask(services=service).PostBanSpammer,
+ ['POST']),
+ (
+ '/sendWipeoutUserListsTask.do',
+ deleteusers.SendWipeoutUserListsTask(
+ services=service).PostSendWipeoutUserListsTask, ['POST']),
+ (
+ '/deleteWipeoutUsersTask.do',
+ deleteusers.DeleteWipeoutUsersTask(
+ services=service).PostDeleteWipeoutUsersTask, ['POST']),
+ (
+ '/deleteUsersTask.do',
+ deleteusers.DeleteUsersTask(services=service).PostDeleteUsersTask,
+ ['POST']),
+ (
+ '/notifyRulesDeleted.do',
+ notify.NotifyRulesDeletedTask(
+ services=service).PostNotifyRulesDeletedTask, ['POST']),
+ (
+ '/notifyIssueChange.do',
+ notify.NotifyIssueChangeTask(
+ services=service).PostNotifyIssueChangeTask, ['POST']),
+ (
+ '/notifyBlockingChange.do',
+ notify.NotifyBlockingChangeTask(
+ services=service).PostNotifyBlockingChangeTask, ['POST']),
+ (
+ '/notifyBulkEdit.do', notify.NotifyBulkChangeTask(
+ services=service).PostNotifyBulkChangeTask, ['POST']),
+ (
+ '/notifyApprovalChange.do',
+ notify.NotifyApprovalChangeTask(
+ services=service).PostNotifyApprovalChangeTask, ['POST']),
+ (
+ '/publishPubsubIssueChange.do',
+ pubsub.PublishPubsubIssueChangeTask(
+ services=service).PostPublishPubsubIssueChangeTask, ['POST']),
+ (
+ '/issueDateAction.do',
+ dateaction.IssueDateActionTask(
+ services=service).PostIssueDateActionTask, ['POST']),
+ (
+ '/fltConversionTask.do',
+ fltconversion.FLTConvertTask(services=service).PostFLTConvertTask,
+ ['POST']),
+ (
+ '/outboundEmail.do',
+ notify.OutboundEmailTask(services=service).PostOutboundEmailTask,
+ ['POST']),
+ (
+ '/recomputeDerivedFields.do',
+ filterrules.RecomputeDerivedFieldsTask(
+ services=service).PostRecomputeDerivedFieldsTask, ['POST']),
]
for rule in _TASK_URL:
@@ -787,54 +739,30 @@
def RegisterCronUrl(self, service):
flaskapp_cron = flask.Flask(__name__)
_CRON_URL = [
- # (
- # '/wipeoutSync',
- # deleteusers.WipeoutSyncCron(services=service).GetWipeoutSyncCron,
- # ['GET']),
- # (
- # '/wipeoutSync.do',
- # deleteusers.WipeoutSyncCron(services=service).PostWipeoutSyncCron,
- # ['POST']),
- # (
- # '/reindexQueue',
- # filterrules.ReindexQueueCron(
- # services=service).GetReindexQueueCron,
- # ['GET']),
- # (
- # '/reindexQueue.do',
- # filterrules.ReindexQueueCron(
- # services=service).PostReindexQueueCron,
- # ['POST']),
- # (
- # '/dateAction',
- # dateaction.DateActionCron(services=service).GetDateActionCron,
- # ['GET']),
- # (
- # '/dateAction.do',
- # dateaction.DateActionCron(services=service).PostDateActionCron,
- # ['POST']),
- # (
- # '/ramCacheConsolidate',
- # cachemanager_svc.RamCacheConsolidate(
- # services=service).GetRamCacheConsolidate, ['GET']),
- # (
- # '/ramCacheConsolidate.do',
- # cachemanager_svc.RamCacheConsolidate(
- # services=service).PostRamCacheConsolidate, ['POST']),
- # ('/reap', reap.Reap(services=service).GetReap, ['GET']),
- # ('/reap.do', reap.Reap(services=service).PostReap, ['POST']),
- # (
- # '/loadApiClientConfigs',
- # client_config_svc.LoadApiClientConfigs().GetLoadApiClientConfigs,
- # ['GET']),
- # (
- # '/trimVisitedPages',
- # trimvisitedpages.TrimVisitedPages(
- # services=service).GetTrimVisitedPages, ['GET']),
- # (
- # '/trimVisitedPages.do',
- # trimvisitedpages.TrimVisitedPages(
- # services=service).PostTrimVisitedPages, ['POST']),
+ (
+ '/wipeoutSync',
+ deleteusers.WipeoutSyncCron(services=service).GetWipeoutSyncCron,
+ ['GET']),
+ (
+ '/reindexQueue',
+ filterrules.ReindexQueueCron(services=service).GetReindexQueueCron,
+ ['GET']),
+ (
+ '/dateAction',
+ dateaction.DateActionCron(services=service).GetDateActionCron,
+ ['GET']),
+ (
+ '/ramCacheConsolidate',
+ cachemanager_svc.RamCacheConsolidate(
+ services=service).GetRamCacheConsolidate, ['GET']),
+ ('/reap', reap.Reap(services=service).GetReap, ['GET']),
+ (
+ '/loadApiClientConfigs', client_config_svc.GetLoadApiClientConfigs,
+ ['GET']),
+ (
+ '/trimVisitedPages',
+ trimvisitedpages.TrimVisitedPages(
+ services=service).GetTrimVisitedPages, ['GET']),
]
for rule in _CRON_URL:
@@ -846,24 +774,14 @@
def RegisterBackendUrl(self, service):
flaskapp_backend = flask.Flask(__name__)
_BACKEND_URL = [
- # (
- # '/search',
- # backendsearch.BackendSearch(services=service).GetBackendSearch,
- # ['GET']),
- # (
- # '/search.do',
- # backendsearch.BackendSearch(services=service).PostBackendSearch,
- # ['POST']),
- # (
- # '/nonviewable',
- # backendnonviewable.BackendNonviewable(
- # services=service).GetBackendNonviewable,
- # ['GET']),
- # (
- # '/nonviewable.do',
- # backendnonviewable.BackendNonviewable(
- # services=service).PostBackendNonviewable,
- # ['POST']),
+ (
+ '/search',
+ backendsearch.BackendSearch(services=service).GetBackendSearch,
+ ['GET']),
+ (
+ '/nonviewable',
+ backendnonviewable.BackendNonviewable(
+ services=service).GetBackendNonviewable, ['GET']),
]
for rule in _BACKEND_URL:
@@ -875,40 +793,34 @@
def RegisterMONSetUrl(self, service):
flaskapp_mon = flask.Flask(__name__)
_MON_URL = [
- # (
- # '/clientmon',
- # clientmon.ClientMonitor(services=service).GetClientMonitor, ['GET'
- # ]),
- # (
- # '/clientmon.do',
- # clientmon.ClientMonitor(services=service).PostClientMonitor,
- # ['POST']),
- # (
- # '/jstsmon.do',
- # ts_mon_js.FlaskMonorailTSMonJSHandler(
- # services=service).PostMonorailTSMonJSHandler,
- # ['POST'],
- # )
+ (
+ '/clientmon.do',
+ clientmon.ClientMonitor(services=service).PostClientMonitor,
+ ['POST']),
+ (
+ '/jstsmon.do',
+ ts_mon_js.FlaskMonorailTSMonJSHandler(
+ services=service).PostMonorailTSMonJSHandler,
+ ['POST'],
+ )
]
flaskapp_mon = self._AddFlaskUrlRules(flaskapp_mon, _MON_URL)
return flaskapp_mon
- # pylint: disable=unused-argument
def RegisterAHUrl(self, service):
flaskapp_ah = flask.Flask(__name__)
_AH_URL = [
- # ('/warmup', warmup.Warmup(services=service).GetWarmup, ['GET']),
- # ('/start', warmup.Start(services=service).GetStart, ['GET']),
- # ('/stop', warmup.Stop(services=service).GetStop, ['GET']),
- # (
- # '/bounce',
- # inboundemail.BouncedEmail(services=service).postBouncedEmail,
- # ['POST']),
- # (
- # '/mail/<string:project_addr>',
- # inboundemail.InboundEmail(services=service).HandleInboundEmail,
- # ['GET', 'POST'])
+ ('/warmup', warmup.Warmup, ['GET']), ('/start', warmup.Start, ['GET']),
+ ('/stop', warmup.Stop, ['GET']),
+ (
+ '/bounce',
+ inboundemail.BouncedEmail(services=service).postBouncedEmail,
+ ['POST']),
+ (
+ '/mail/<string:project_addr>',
+ inboundemail.InboundEmail(services=service).HandleInboundEmail,
+ ['GET', 'POST'])
]
flaskapp_ah = self._AddFlaskUrlRules(flaskapp_ah, _AH_URL)
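Note: each Register* method above builds a list of (path, view_func, methods) tuples and passes it to _AddFlaskUrlRules, whose body is outside this hunk. Presumably it just loops over the tuples and registers each rule on the sub-app; a hypothetical sketch consistent with how the tables are shaped (the actual helper may differ):

    import flask

    def _AddFlaskUrlRules(app, rule_tuples):
        # rule_tuples: iterable of (path, view_func, methods) entries.
        for path, view_func, methods in rule_tuples:
            app.add_url_rule(path, view_func=view_func, methods=methods)
        return app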
diff --git a/framework/banned.py b/framework/banned.py
index 231f76f..209a715 100644
--- a/framework/banned.py
+++ b/framework/banned.py
@@ -22,7 +22,7 @@
from framework import servlet
-class Banned(servlet.Servlet):
+class Banned(flaskservlet.FlaskServlet):
"""The Banned page shows a message explaining that the user is banned."""
_PAGE_TEMPLATE = 'framework/banned-page.ezt'
@@ -53,5 +53,5 @@
'currentPageURLEncoded': None,
}
- # def GetNoAccessPage(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetNoAccessPage(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/framework/clientmon.py b/framework/clientmon.py
index f512f4d..fd10684 100644
--- a/framework/clientmon.py
+++ b/framework/clientmon.py
@@ -19,8 +19,7 @@
from infra_libs import ts_mon
-# TODO: convert to FlaskJsonFeed while convert to Flask
-class ClientMonitor(jsonfeed.JsonFeed):
+class ClientMonitor(jsonfeed.FlaskJsonFeed):
"""JSON feed to track client side js errors in ts_mon."""
js_errors = ts_mon.CounterMetric('frontend/js_errors',
@@ -37,9 +36,7 @@
Dict of values used by EZT for rendering the page.
"""
- # TODO: uncomment while convert to flask
- # post_data = mr.request.values
- post_data = mr.request.POST
+ post_data = mr.request.values
errors = post_data.get('errors')
try:
errors = json.loads(errors)
@@ -55,8 +52,5 @@
return {}
- # def GetClientMonitor(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostClientMonitor(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostClientMonitor(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/framework/csp_report.py b/framework/csp_report.py
index 83e3126..4b6f29e 100644
--- a/framework/csp_report.py
+++ b/framework/csp_report.py
@@ -11,12 +11,10 @@
from __future__ import division
from __future__ import absolute_import
-import webapp2
+import flask
import logging
-class CSPReportPage(webapp2.RequestHandler):
+def postCsp():
"""CSPReportPage serves CSP violation reports."""
-
- def post(self):
- logging.error('CSP Violation: %s' % self.request.body)
+ logging.error('CSP Violation: %s' % flask.request.get_data(as_text=True))
diff --git a/framework/deleteusers.py b/framework/deleteusers.py
index 739782e..015fad4 100644
--- a/framework/deleteusers.py
+++ b/framework/deleteusers.py
@@ -32,8 +32,7 @@
return credentials.authorize(httplib2.Http(timeout=60))
-# TODO: change to FlaskInternalTask when convert to Flask
-class WipeoutSyncCron(jsonfeed.InternalTask):
+class WipeoutSyncCron(jsonfeed.FlaskInternalTask):
"""Enqueue tasks for sending user lists to wipeout-lite and deleting deleted
users fetched from wipeout-lite."""
@@ -62,15 +61,11 @@
cloud_tasks_helpers.create_task(
task, queue=framework_constants.QUEUE_FETCH_WIPEOUT_DELETED_USERS)
- # def GetWipeoutSyncCron(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostWipeoutSyncCron(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetWipeoutSyncCron(self, **kwargs):
+ return self.handler(**kwargs)
-# TODO: Set to FlaskInternalTask when convert
-class SendWipeoutUserListsTask(jsonfeed.InternalTask):
+class SendWipeoutUserListsTask(jsonfeed.FlaskInternalTask):
"""Sends a batch of monorail users to wipeout-lite."""
def HandleRequest(self, mr):
@@ -95,15 +90,11 @@
logging.info(
'Received response, %s with contents, %s', resp, data)
- # def GetSendWipeoutUserListsTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostSendWipeoutUserListsTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostSendWipeoutUserListsTask(self, **kwargs):
+ return self.handler(**kwargs)
-# TODO: Set to FlaskInternalTask when convert
-class DeleteWipeoutUsersTask(jsonfeed.InternalTask):
+class DeleteWipeoutUsersTask(jsonfeed.FlaskInternalTask):
"""Fetches deleted users from wipeout-lite and enqueues tasks to delete
those users from Monorail's DB."""
@@ -137,15 +128,11 @@
'Received response, %s with contents, %s', resp, data)
return json.loads(data)
- # def GetDeleteWipeoutUsersTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostDeleteWipeoutUsersTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostDeleteWipeoutUsersTask(self, **kwargs):
+ return self.handler(**kwargs)
-# TODO: Set to FlaskInternalTask when convert
-class DeleteUsersTask(jsonfeed.InternalTask):
+class DeleteUsersTask(jsonfeed.FlaskInternalTask):
"""Deletes users from Monorail's DB."""
def HandleRequest(self, mr):
@@ -160,8 +147,5 @@
with work_env.WorkEnv(mr, self.services) as we:
we.ExpungeUsers(emails, check_perms=False)
- # def GetDeleteUsersTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostDeleteUsersTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostDeleteUsersTask(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/framework/excessiveactivity.py b/framework/excessiveactivity.py
index 0e54ebd..5506de3 100644
--- a/framework/excessiveactivity.py
+++ b/framework/excessiveactivity.py
@@ -12,18 +12,17 @@
from __future__ import division
from __future__ import absolute_import
-from framework import servlet
+from framework import flaskservlet
-class ExcessiveActivity(servlet.Servlet):
+class ExcessiveActivity(flaskservlet.FlaskServlet):
"""ExcessiveActivity page shows an error message."""
_PAGE_TEMPLATE = 'framework/excessive-activity-page.ezt'
# pylint: disable=unused-argument
def GetExcessiveActivity(self, **kwargs):
- return
- # return self.handler(**kwargs)
+ return self.handler(**kwargs)
def GatherPageData(self, _mr):
"""Build up a dictionary of data values to use when rendering the page."""
diff --git a/framework/flaskservlet.py b/framework/flaskservlet.py
index fce3eab..bc543d8 100644
--- a/framework/flaskservlet.py
+++ b/framework/flaskservlet.py
@@ -877,5 +877,5 @@
user_pb.last_visit_timestamp = now
self.services.user.UpdateUser(mr.cnxn, user_pb.user_id, user_pb)
- def abort(self, code, context):
+ def abort(self, code, context=""):
return flask.abort(code, context)
diff --git a/framework/jsonfeed.py b/framework/jsonfeed.py
index b7d85ac..1eff87d 100644
--- a/framework/jsonfeed.py
+++ b/framework/jsonfeed.py
@@ -171,7 +171,7 @@
if self.CHECK_SAME_APP and not settings.local_mode:
calling_app_id = request.headers.get('X-Appengine-Inbound-Appid')
if calling_app_id != app_identity.get_application_id():
- self.response.status = http_client.FORBIDDEN
+ self.response.status_code = http_client.FORBIDDEN
return
self._CheckForMovedProject(mr, request)
@@ -188,7 +188,7 @@
self.abort(400, msg)
except permissions.PermissionException as e:
logging.info('Trapped PermissionException %s', e)
- self.response.status = http_client.FORBIDDEN
+ self.response.status_code = http_client.FORBIDDEN
# pylint: disable=unused-argument
# pylint: disable=arguments-differ
diff --git a/framework/logger.py b/framework/logger.py
new file mode 100644
index 0000000..d2a8a0d
--- /dev/null
+++ b/framework/logger.py
@@ -0,0 +1,21 @@
+# Copyright 2022 The Chromium Authors. All rights reserved.
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
+""""Helper methods for structured logging."""
+
+from __future__ import print_function
+from __future__ import division
+from __future__ import absolute_import
+
+import google.cloud.logging
+
+import settings
+
+
+def log(struct):
+ if settings.local_mode or settings.unit_test_mode:
+ return
+
+ logging_client = google.cloud.logging.Client()
+ logger = logging_client.logger('python')
+ logger.log_struct(struct)
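Note: framework/logger.py gives call sites a one-line way to emit structured entries; it returns early under local_mode or unit_test_mode, so callers need no guards of their own. A brief usage sketch mirroring the sql.py call below (field values are illustrative):

    from framework import logger

    # No-op locally and in unit tests; logs to the 'python' logger in prod.
    logger.log({
        'log_type': 'database/query',
        'statement': 'SELECT 1',
        'duration': 2.5,
        'row_count': 1,
    })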
diff --git a/framework/monitoring.py b/framework/monitoring.py
index 6407e2d..08a1e23 100644
--- a/framework/monitoring.py
+++ b/framework/monitoring.py
@@ -4,13 +4,10 @@
"""Monitoring ts_mon custom to monorail."""
-import os
-import sys
-lib_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'lib')
-
-from google.cloud import logging
from infra_libs import ts_mon
+
from framework import framework_helpers
+from framework import logger
import settings
@@ -49,11 +46,9 @@
API_REQUESTS_COUNT.increment_by(1, fields)
if not settings.unit_test_mode:
- logging_client = logging.Client()
- logger = logging_client.logger("request_log")
- logger.log_struct(
+ logger.log(
{
- 'log_type': "IncrementAPIRequestsCount",
+ 'log_type': 'IncrementAPIRequestsCount',
'client_id': client_id,
'client_email': client_email,
'requests_count': str(API_REQUESTS_COUNT.get(fields)),
diff --git a/framework/reap.py b/framework/reap.py
index d0b721f..4654964 100644
--- a/framework/reap.py
+++ b/framework/reap.py
@@ -16,8 +16,7 @@
RUN_DURATION_LIMIT = 50 * 60 # 50 minutes
-# TODO: change to FlaskInternalTask when convert to Flask
-class Reap(jsonfeed.InternalTask):
+class Reap(jsonfeed.FlaskInternalTask):
"""Look for doomed and deletable projects and delete them."""
def HandleRequest(self, mr):
@@ -125,8 +124,5 @@
f(cnxn, project_id)
yield project_id
- # def GetReap(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostReap(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetReap(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/framework/servlet.py b/framework/servlet.py
index 462939a..b363095 100644
--- a/framework/servlet.py
+++ b/framework/servlet.py
@@ -304,6 +304,8 @@
browser_major_version = int(ua['browser']['version'].split('.')[0])
except ValueError:
logging.warn('Could not parse version: %r', ua['browser']['version'])
+ except KeyError:
+ logging.warn('No browser version defined in user agent.')
csp_supports_report_sample = (
(browser == 'Chrome' and browser_major_version >= 59) or
(browser == 'Opera' and browser_major_version >= 46))
diff --git a/framework/sql.py b/framework/sql.py
index 1d7573d..0fb8043 100644
--- a/framework/sql.py
+++ b/framework/sql.py
@@ -23,6 +23,7 @@
from framework import exceptions
from framework import framework_helpers
+from framework import logger
from infra_libs import ts_mon
@@ -258,12 +259,20 @@
DB_RESULT_ROWS.add(cursor.rowcount)
if stmt_str.startswith('INSERT') or stmt_str.startswith('REPLACE'):
- formatted_statement = '%s %s' % (stmt_str, stmt_args)
+ formatted_statement = ('%s %s' % (stmt_str, stmt_args)).replace('\n', ' ')
else:
- formatted_statement = stmt_str % tuple(stmt_args)
+ formatted_statement = (stmt_str % tuple(stmt_args)).replace('\n', ' ')
logging.info(
'%d rows in %d ms: %s', cursor.rowcount, int(duration),
- formatted_statement.replace('\n', ' '))
+ formatted_statement)
+ if duration >= 2000:
+ logger.log({
+ 'log_type': 'database/query',
+ 'statement': formatted_statement,
+ 'type': formatted_statement.split(' ')[0],
+ 'duration': duration / 1000,
+ 'row_count': cursor.rowcount,
+ })
if commit and not stmt_str.startswith('SELECT'):
try:
diff --git a/framework/test/banned_test.py b/framework/test/banned_test.py
index 73b9f03..0331cdd 100644
--- a/framework/test/banned_test.py
+++ b/framework/test/banned_test.py
@@ -24,16 +24,16 @@
self.services = service_manager.Services()
def testAssertBasePermission(self):
- servlet = banned.Banned('request', 'response', services=self.services)
+ servlet = banned.Banned(services=self.services)
mr = monorailrequest.MonorailRequest(self.services)
mr.auth.user_id = 0 # Anon user cannot see banned page.
- with self.assertRaises(webapp2.HTTPException) as cm:
+ with self.assertRaises(Exception) as cm:
servlet.AssertBasePermission(mr)
self.assertEqual(404, cm.exception.code)
mr.auth.user_id = 111 # User who is not banned cannot view banned page.
- with self.assertRaises(webapp2.HTTPException) as cm:
+ with self.assertRaises(Exception) as cm:
servlet.AssertBasePermission(mr)
self.assertEqual(404, cm.exception.code)
@@ -42,7 +42,7 @@
servlet.AssertBasePermission(mr)
def testGatherPageData(self):
- servlet = banned.Banned('request', 'response', services=self.services)
+ servlet = banned.Banned(services=self.services)
self.assertNotEqual(servlet.template, None)
_request, mr = testing_helpers.GetRequestObjects()
diff --git a/framework/test/deleteusers_test.py b/framework/test/deleteusers_test.py
index 2609867..87ed5bc 100644
--- a/framework/test/deleteusers_test.py
+++ b/framework/test/deleteusers_test.py
@@ -25,8 +25,7 @@
def setUp(self):
self.services = service_manager.Services(user=fake.UserService())
- self.task = deleteusers.WipeoutSyncCron(
- request=None, response=None, services=self.services)
+ self.task = deleteusers.WipeoutSyncCron(services=self.services)
self.user_1 = self.services.user.TestAddUser('user1@example.com', 111)
self.user_2 = self.services.user.TestAddUser('user2@example.com', 222)
self.user_3 = self.services.user.TestAddUser('user3@example.com', 333)
@@ -100,8 +99,7 @@
def setUp(self):
self.services = service_manager.Services(user=fake.UserService())
- self.task = deleteusers.SendWipeoutUserListsTask(
- request=None, response=None, services=self.services)
+ self.task = deleteusers.SendWipeoutUserListsTask(services=self.services)
self.task.sendUserLists = mock.Mock()
deleteusers.authorize = mock.Mock(return_value='service')
self.user_1 = self.services.user.TestAddUser('user1@example.com', 111)
@@ -143,8 +141,7 @@
def setUp(self):
self.services = service_manager.Services()
deleteusers.authorize = mock.Mock(return_value='service')
- self.task = deleteusers.DeleteWipeoutUsersTask(
- request=None, response=None, services=self.services)
+ self.task = deleteusers.DeleteWipeoutUsersTask(services=self.services)
deleted_users = [
{'id': 'user1@gmail.com'}, {'id': 'user2@gmail.com'},
{'id': 'user3@gmail.com'}, {'id': 'user4@gmail.com'}]
diff --git a/framework/test/framework_helpers_test.py b/framework/test/framework_helpers_test.py
index 1d0146c..fb8810b 100644
--- a/framework/test/framework_helpers_test.py
+++ b/framework/test/framework_helpers_test.py
@@ -11,7 +11,10 @@
import mock
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import time
from businesslogic import work_env
diff --git a/framework/test/monorailcontext_test.py b/framework/test/monorailcontext_test.py
index ed93920..2071c9e 100644
--- a/framework/test/monorailcontext_test.py
+++ b/framework/test/monorailcontext_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from framework import authdata
from framework import monorailcontext
diff --git a/framework/test/monorailrequest_test.py b/framework/test/monorailrequest_test.py
index fcd30c3..ef52f1e 100644
--- a/framework/test/monorailrequest_test.py
+++ b/framework/test/monorailrequest_test.py
@@ -13,7 +13,10 @@
import re
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import six
from google.appengine.api import oauth
diff --git a/framework/test/permissions_test.py b/framework/test/permissions_test.py
index 0917b53..cd67c6c 100644
--- a/framework/test/permissions_test.py
+++ b/framework/test/permissions_test.py
@@ -11,7 +11,10 @@
import time
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import settings
from framework import authdata
diff --git a/framework/test/ratelimiter_test.py b/framework/test/ratelimiter_test.py
index b351f8c..84230e8 100644
--- a/framework/test/ratelimiter_test.py
+++ b/framework/test/ratelimiter_test.py
@@ -15,7 +15,10 @@
from google.appengine.api import memcache
from google.appengine.ext import testbed
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import os
import settings
diff --git a/framework/test/reap_test.py b/framework/test/reap_test.py
index f1a907d..92d17fb 100644
--- a/framework/test/reap_test.py
+++ b/framework/test/reap_test.py
@@ -11,7 +11,10 @@
import unittest
import mock
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from mock import Mock
@@ -69,7 +72,7 @@
def testMarkDoomedProjects(self):
self.setUpMarkDoomedProjects()
- reaper = reap.Reap('req', 'resp', services=self.services)
+ reaper = reap.Reap(services=self.services)
self.mox.ReplayAll()
doomed_project_ids = reaper._MarkDoomedProjects(self.cnxn)
@@ -92,7 +95,7 @@
def testExpungeDeletableProjects(self):
self.setUpExpungeParts()
- reaper = reap.Reap('req', 'resp', services=self.services)
+ reaper = reap.Reap(services=self.services)
self.mox.ReplayAll()
expunged_project_ids = reaper._ExpungeDeletableProjects(self.cnxn)
diff --git a/framework/test/sorting_test.py b/framework/test/sorting_test.py
index 4b1feb3..4251308 100644
--- a/framework/test/sorting_test.py
+++ b/framework/test/sorting_test.py
@@ -12,7 +12,10 @@
# For convenient debugging
import logging
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from framework import sorting
from framework import framework_views
diff --git a/framework/test/warmup_test.py b/framework/test/warmup_test.py
index 8140fc7..13223f1 100644
--- a/framework/test/warmup_test.py
+++ b/framework/test/warmup_test.py
@@ -10,26 +10,39 @@
import unittest
-from testing import testing_helpers
+import flask
-from framework import sql
from framework import warmup
-from services import service_manager
class WarmupTest(unittest.TestCase):
- def setUp(self):
- #self.cache_manager = cachemanager_svc.CacheManager()
- #self.services = service_manager.Services(
- # cache_manager=self.cache_manager)
- self.services = service_manager.Services()
- self.servlet = warmup.Warmup('req', 'res', services=self.services)
+ def testHandleWarmup(self):
+ app = flask.Flask(__name__)
+ app.add_url_rule('/', view_func=warmup.Warmup)
+ with app.test_client() as client:
+ response = client.get('/')
- def testHandleRequest_NothingToDo(self):
- mr = testing_helpers.MakeMonorailRequest()
- actual_json_data = self.servlet.HandleRequest(mr)
- self.assertEqual(
- {'success': 1},
- actual_json_data)
+ self.assertEqual(response.status_code, 200)
+ self.assertEqual(response.data, '')
+
+ def testHandleStart(self):
+ app = flask.Flask(__name__)
+ app.add_url_rule('/', view_func=warmup.Start)
+
+ with app.test_client() as client:
+ response = client.get('/')
+
+ self.assertEqual(response.status_code, 200)
+ self.assertEqual(response.data, '')
+
+ def testHandleStop(self):
+ app = flask.Flask(__name__)
+ app.add_url_rule('/', view_func=warmup.Stop)
+
+ with app.test_client() as client:
+ response = client.get('/')
+
+ self.assertEqual(response.status_code, 200)
+ self.assertEqual(response.data, '')
diff --git a/framework/trimvisitedpages.py b/framework/trimvisitedpages.py
index f036c10..f43ab09 100644
--- a/framework/trimvisitedpages.py
+++ b/framework/trimvisitedpages.py
@@ -11,8 +11,7 @@
from framework import jsonfeed
-# TODO: change to FlaskInternalTask when convert to Flask
-class TrimVisitedPages(jsonfeed.InternalTask):
+class TrimVisitedPages(jsonfeed.FlaskInternalTask):
"""Look for users with more than 10 visited hotlists and deletes extras."""
@@ -20,8 +19,5 @@
"""Delete old RecentHotlist2User rows when there are too many"""
self.services.user.TrimUserVisitedHotlists(mr.cnxn)
- # def GetTrimVisitedPages(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostTrimVisitedPages(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetTrimVisitedPages(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/framework/warmup.py b/framework/warmup.py
index ace76ce..a133107 100644
--- a/framework/warmup.py
+++ b/framework/warmup.py
@@ -10,55 +10,26 @@
import logging
-from framework import jsonfeed
-
-# TODO(https://crbug.com/monorail/6511): Convert to FlaskInternalTask
-class Warmup(jsonfeed.InternalTask):
+def Warmup():
"""Placeholder for warmup work. Used only to enable min_idle_instances."""
-
- def HandleRequest(self, _mr):
- """Don't do anything that could cause a jam when many instances start."""
- logging.info('/_ah/startup does nothing in Monorail.')
- logging.info('However it is needed for min_idle_instances in app.yaml.')
-
- return {
- 'success': 1,
- }
-
- # def GetWarmup(self, **kwargs):
- # return self.handler(**kwargs)
+ # Don't do anything that could cause a jam when many instances start.
+ logging.info('/_ah/startup does nothing in Monorail.')
+ logging.info('However it is needed for min_idle_instances in app.yaml.')
+ return ''
-# TODO(https://crbug.com/monorail/6511): Convert to FlaskInternalTask
-class Start(jsonfeed.InternalTask):
+def Start():
"""Placeholder for start work. Used only to enable manual_scaling."""
-
- def HandleRequest(self, _mr):
- """Don't do anything that could cause a jam when many instances start."""
- logging.info('/_ah/start does nothing in Monorail.')
- logging.info('However it is needed for manual_scaling in app.yaml.')
-
- return {
- 'success': 1,
- }
-
- # def GetStart(self, **kwargs):
- # return self.handler(**kwargs)
+ # Don't do anything that could cause a jam when many instances start.
+ logging.info('/_ah/start does nothing in Monorail.')
+ logging.info('However it is needed for manual_scaling in app.yaml.')
+ return ''
-# TODO(https://crbug.com/monorail/6511): Convert to FlaskInternalTask
-class Stop(jsonfeed.InternalTask):
+def Stop():
"""Placeholder for stop work. Used only to enable manual_scaling."""
-
- def HandleRequest(self, _mr):
- """Don't do anything that could cause a jam when many instances start."""
- logging.info('/_ah/stop does nothing in Monorail.')
- logging.info('However it is needed for manual_scaling in app.yaml.')
-
- return {
- 'success': 1,
- }
-
- # def GetStop(self, **kwargs):
- # return self.handler(**kwargs)
+ # Don't do anything that could cause a jam when many instances start.
+ logging.info('/_ah/stop does nothing in Monorail.')
+ logging.info('However it is needed for manual_scaling in app.yaml.')
+ return ''
diff --git a/monorailapp.py b/monorailapp.py
index fd7c259..a6fa6ba 100644
--- a/monorailapp.py
+++ b/monorailapp.py
@@ -32,9 +32,7 @@
app_routes = registry.Register(services)
app = webapp2.WSGIApplication(
app_routes, config={'services': services})
-# TODO(crbug.com/1322775) Migrate away from the shared prodx-mon-chrome-infra
-# service account and change to gae_ts_mon.initialize_prod()
-gae_ts_mon.initialize_adhoc(app)
+gae_ts_mon.initialize_prod(app)
flask_regist = flaskregisterpages.ServletRegistry()
app = dispatcher.DispatcherMiddleware(
@@ -42,15 +40,16 @@
{
'/hosting_old': flask_regist.RegisterOldHostUrl(services),
'/projects': flask_regist.RegisterRedirectProjectUrl(),
- # '/_': flask_regist.RegisterMONSetUrl(services),
- # '/hosting': flask_regist.RegisterHostingUrl(services),
- # '/g': flask_regist.RegisterGroupUrls(services),
+ '/csp': flask_regist.RegisterCspUrl(),
+ '/_': flask_regist.RegisterMONSetUrl(services),
+ '/hosting': flask_regist.RegisterHostingUrl(services),
+ '/g': flask_regist.RegisterGroupUrls(services),
# '/p': flask_regist.RegisterProjectUrls(services),
# '/u': flask_regist.RegisterUserUrls(services),
- # '/_task': flask_regist.RegisterTaskUrl(services),
- # '/_cron': flask_regist.RegisterCronUrl(services),
- # '/_backend': flask_regist.RegisterBackendUrl(services),
- # '/_ah': flask_regist.RegisterAHUrl(services),
+ '/_task': flask_regist.RegisterTaskUrl(services),
+ '/_cron': flask_regist.RegisterCronUrl(services),
+ '/_backend': flask_regist.RegisterBackendUrl(services),
+ '/_ah': flask_regist.RegisterAHUrl(services),
})
endpoints = endpoints_webapp2.api_server(
diff --git a/package-lock.json b/package-lock.json
index d24774a..ba391b7 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -23,7 +23,7 @@
"chart.js": "^2.9.4",
"debounce": "^1.2.1",
"diff": "^5.0.0",
- "dompurify": "2.3.6",
+ "dompurify": "2.3.10",
"lit-element": "^2.5.1",
"lit-html": "^1.4.1",
"marked": "^4.0.14",
@@ -2076,6 +2076,21 @@
"node": ">=8"
}
},
+ "node_modules/@jridgewell/gen-mapping": {
+ "version": "0.3.2",
+ "resolved": "https://npm.skia.org/chops-monorail/@jridgewell%2fgen-mapping/-/gen-mapping-0.3.2.tgz",
+ "integrity": "sha512-mh65xKQAzI6iBcFzwv28KVWSmCkdRBWoOh+bYQGW3+6OZvbbN3TqMGo5hqYxQniRcH9F2VZIoJCm4pa3BPDK/A==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@jridgewell/set-array": "^1.0.1",
+ "@jridgewell/sourcemap-codec": "^1.4.10",
+ "@jridgewell/trace-mapping": "^0.3.9"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
"node_modules/@jridgewell/resolve-uri": {
"version": "3.0.5",
"resolved": "https://npm.skia.org/chops-monorail/@jridgewell/resolve-uri/-/resolve-uri-3.0.5.tgz",
@@ -2084,6 +2099,27 @@
"node": ">=6.0.0"
}
},
+ "node_modules/@jridgewell/set-array": {
+ "version": "1.1.2",
+ "resolved": "https://npm.skia.org/chops-monorail/@jridgewell%2fset-array/-/set-array-1.1.2.tgz",
+ "integrity": "sha512-xnkseuNADM0gt2bs+BvhO0p78Mk762YnZdsuzFV018NoG1Sj1SCQvpSqa7XUaTam5vAGasABV9qXASMKnFMwMw==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@jridgewell/source-map": {
+ "version": "0.3.2",
+ "resolved": "https://npm.skia.org/chops-monorail/@jridgewell%2fsource-map/-/source-map-0.3.2.tgz",
+ "integrity": "sha512-m7O9o2uR8k2ObDysZYzdfhb08VuEml5oWGiosa1VdaPZ/A6QyPkAJuwN0Q1lhULOf6B7MtQmHENS743hWtCrgw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@jridgewell/gen-mapping": "^0.3.0",
+ "@jridgewell/trace-mapping": "^0.3.9"
+ }
+ },
"node_modules/@jridgewell/sourcemap-codec": {
"version": "1.4.11",
"resolved": "https://npm.skia.org/chops-monorail/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.4.11.tgz",
@@ -4527,9 +4563,10 @@
}
},
"node_modules/dompurify": {
- "version": "2.3.6",
- "resolved": "https://npm.skia.org/chops-monorail/dompurify/-/dompurify-2.3.6.tgz",
- "integrity": "sha512-OFP2u/3T1R5CEgWCEONuJ1a5+MFKnOYpkywpUSxv/dj1LeBT1erK+JwM7zK0ROy2BRhqVCf0LRw/kHqKuMkVGg=="
+ "version": "2.3.10",
+ "resolved": "https://npm.skia.org/chops-monorail/dompurify/-/dompurify-2.3.10.tgz",
+ "integrity": "sha512-o7Fg/AgC7p/XpKjf/+RC3Ok6k4St5F7Q6q6+Nnm3p2zGWioAY6dh0CbbuwOhH2UcSzKsdniE/YnE2/92JcsA+g==",
+ "license": "(MPL-2.0 OR Apache-2.0)"
},
"node_modules/domutils": {
"version": "2.8.0",
@@ -7459,9 +7496,10 @@
}
},
"node_modules/moment": {
- "version": "2.29.3",
- "resolved": "https://npm.skia.org/chops-monorail/moment/-/moment-2.29.3.tgz",
- "integrity": "sha512-c6YRvhEo//6T2Jz/vVtYzqBzwvPT95JBQ+smCytzf7c50oMZRsR/a4w88aD34I+/QVSfnoAnSBFPJHItlOMJVw==",
+ "version": "2.29.4",
+ "resolved": "https://npm.skia.org/chops-monorail/moment/-/moment-2.29.4.tgz",
+ "integrity": "sha512-5LC9SOxjSc2HF6vO2CyuTDNivEdoz2IvyJJGj6X8DJ0eFyfszE0QiEd+iXmBvUP3WHxSjFH/vIsA0EN00cgr8w==",
+ "license": "MIT",
"engines": {
"node": "*"
}
@@ -9288,14 +9326,15 @@
}
},
"node_modules/terser": {
- "version": "5.12.1",
- "resolved": "https://npm.skia.org/chops-monorail/terser/-/terser-5.12.1.tgz",
- "integrity": "sha512-NXbs+7nisos5E+yXwAD+y7zrcTkMqb0dEJxIGtSKPdCBzopf7ni4odPul2aechpV7EXNvOudYOX2bb5tln1jbQ==",
+ "version": "5.14.2",
+ "resolved": "https://npm.skia.org/chops-monorail/terser/-/terser-5.14.2.tgz",
+ "integrity": "sha512-oL0rGeM/WFQCUd0y2QrWxYnq7tfSuKBiqTjRPWrRgB46WD/kiwHwF8T23z78H6Q6kGCuuHcPB+KULHRdxvVGQA==",
"dev": true,
+ "license": "BSD-2-Clause",
"dependencies": {
+ "@jridgewell/source-map": "^0.3.2",
"acorn": "^8.5.0",
"commander": "^2.20.0",
- "source-map": "~0.7.2",
"source-map-support": "~0.5.20"
},
"bin": {
@@ -9372,15 +9411,6 @@
"integrity": "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==",
"dev": true
},
- "node_modules/terser/node_modules/source-map": {
- "version": "0.7.3",
- "resolved": "https://npm.skia.org/chops-monorail/source-map/-/source-map-0.7.3.tgz",
- "integrity": "sha512-CkCj6giN3S+n9qrYiBTX5gystlENnRW5jZeNLHpe6aue+SrHcG5VYwujhW9s4dY31mEGsxBDrHR6oI69fTXsaQ==",
- "dev": true,
- "engines": {
- "node": ">= 8"
- }
- },
"node_modules/text-table": {
"version": "0.2.0",
"resolved": "https://npm.skia.org/chops-monorail/text-table/-/text-table-0.2.0.tgz",
@@ -11629,11 +11659,38 @@
"integrity": "sha512-ZXRY4jNvVgSVQ8DL3LTcakaAtXwTVUxE81hslsyD2AtoXW/wVob10HkOJ1X/pAlcI7D+2YoZKg5do8G/w6RYgA==",
"dev": true
},
+ "@jridgewell/gen-mapping": {
+ "version": "0.3.2",
+ "resolved": "https://npm.skia.org/chops-monorail/@jridgewell%2fgen-mapping/-/gen-mapping-0.3.2.tgz",
+ "integrity": "sha512-mh65xKQAzI6iBcFzwv28KVWSmCkdRBWoOh+bYQGW3+6OZvbbN3TqMGo5hqYxQniRcH9F2VZIoJCm4pa3BPDK/A==",
+ "dev": true,
+ "requires": {
+ "@jridgewell/set-array": "^1.0.1",
+ "@jridgewell/sourcemap-codec": "^1.4.10",
+ "@jridgewell/trace-mapping": "^0.3.9"
+ }
+ },
"@jridgewell/resolve-uri": {
"version": "3.0.5",
"resolved": "https://npm.skia.org/chops-monorail/@jridgewell/resolve-uri/-/resolve-uri-3.0.5.tgz",
"integrity": "sha512-VPeQ7+wH0itvQxnG+lIzWgkysKIr3L9sslimFW55rHMdGu/qCQ5z5h9zq4gI8uBtqkpHhsF4Z/OwExufUCThew=="
},
+ "@jridgewell/set-array": {
+ "version": "1.1.2",
+ "resolved": "https://npm.skia.org/chops-monorail/@jridgewell%2fset-array/-/set-array-1.1.2.tgz",
+ "integrity": "sha512-xnkseuNADM0gt2bs+BvhO0p78Mk762YnZdsuzFV018NoG1Sj1SCQvpSqa7XUaTam5vAGasABV9qXASMKnFMwMw==",
+ "dev": true
+ },
+ "@jridgewell/source-map": {
+ "version": "0.3.2",
+ "resolved": "https://npm.skia.org/chops-monorail/@jridgewell%2fsource-map/-/source-map-0.3.2.tgz",
+ "integrity": "sha512-m7O9o2uR8k2ObDysZYzdfhb08VuEml5oWGiosa1VdaPZ/A6QyPkAJuwN0Q1lhULOf6B7MtQmHENS743hWtCrgw==",
+ "dev": true,
+ "requires": {
+ "@jridgewell/gen-mapping": "^0.3.0",
+ "@jridgewell/trace-mapping": "^0.3.9"
+ }
+ },
"@jridgewell/sourcemap-codec": {
"version": "1.4.11",
"resolved": "https://npm.skia.org/chops-monorail/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.4.11.tgz",
@@ -13408,9 +13465,9 @@
}
},
"dompurify": {
- "version": "2.3.6",
- "resolved": "https://npm.skia.org/chops-monorail/dompurify/-/dompurify-2.3.6.tgz",
- "integrity": "sha512-OFP2u/3T1R5CEgWCEONuJ1a5+MFKnOYpkywpUSxv/dj1LeBT1erK+JwM7zK0ROy2BRhqVCf0LRw/kHqKuMkVGg=="
+ "version": "2.3.10",
+ "resolved": "https://npm.skia.org/chops-monorail/dompurify/-/dompurify-2.3.10.tgz",
+ "integrity": "sha512-o7Fg/AgC7p/XpKjf/+RC3Ok6k4St5F7Q6q6+Nnm3p2zGWioAY6dh0CbbuwOhH2UcSzKsdniE/YnE2/92JcsA+g=="
},
"domutils": {
"version": "2.8.0",
@@ -15584,9 +15641,9 @@
}
},
"moment": {
- "version": "2.29.3",
- "resolved": "https://npm.skia.org/chops-monorail/moment/-/moment-2.29.3.tgz",
- "integrity": "sha512-c6YRvhEo//6T2Jz/vVtYzqBzwvPT95JBQ+smCytzf7c50oMZRsR/a4w88aD34I+/QVSfnoAnSBFPJHItlOMJVw=="
+ "version": "2.29.4",
+ "resolved": "https://npm.skia.org/chops-monorail/moment/-/moment-2.29.4.tgz",
+ "integrity": "sha512-5LC9SOxjSc2HF6vO2CyuTDNivEdoz2IvyJJGj6X8DJ0eFyfszE0QiEd+iXmBvUP3WHxSjFH/vIsA0EN00cgr8w=="
},
"mousetrap": {
"version": "1.6.5",
@@ -16923,14 +16980,14 @@
}
},
"terser": {
- "version": "5.12.1",
- "resolved": "https://npm.skia.org/chops-monorail/terser/-/terser-5.12.1.tgz",
- "integrity": "sha512-NXbs+7nisos5E+yXwAD+y7zrcTkMqb0dEJxIGtSKPdCBzopf7ni4odPul2aechpV7EXNvOudYOX2bb5tln1jbQ==",
+ "version": "5.14.2",
+ "resolved": "https://npm.skia.org/chops-monorail/terser/-/terser-5.14.2.tgz",
+ "integrity": "sha512-oL0rGeM/WFQCUd0y2QrWxYnq7tfSuKBiqTjRPWrRgB46WD/kiwHwF8T23z78H6Q6kGCuuHcPB+KULHRdxvVGQA==",
"dev": true,
"requires": {
+ "@jridgewell/source-map": "^0.3.2",
"acorn": "^8.5.0",
"commander": "^2.20.0",
- "source-map": "~0.7.2",
"source-map-support": "~0.5.20"
},
"dependencies": {
@@ -16939,12 +16996,6 @@
"resolved": "https://npm.skia.org/chops-monorail/commander/-/commander-2.20.3.tgz",
"integrity": "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==",
"dev": true
- },
- "source-map": {
- "version": "0.7.3",
- "resolved": "https://npm.skia.org/chops-monorail/source-map/-/source-map-0.7.3.tgz",
- "integrity": "sha512-CkCj6giN3S+n9qrYiBTX5gystlENnRW5jZeNLHpe6aue+SrHcG5VYwujhW9s4dY31mEGsxBDrHR6oI69fTXsaQ==",
- "dev": true
}
}
},
diff --git a/package.json b/package.json
index 95d9129..db3a7d0 100644
--- a/package.json
+++ b/package.json
@@ -79,7 +79,7 @@
"chart.js": "^2.9.4",
"debounce": "^1.2.1",
"diff": "^5.0.0",
- "dompurify": "2.3.6",
+ "dompurify": "2.3.10",
"lit-element": "^2.5.1",
"lit-html": "^1.4.1",
"marked": "^4.0.14",
diff --git a/project/test/projectupdates_test.py b/project/test/projectupdates_test.py
index c2542e8..e4c5cea 100644
--- a/project/test/projectupdates_test.py
+++ b/project/test/projectupdates_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from features import activities
from project import projectupdates
diff --git a/registerpages.py b/registerpages.py
index 2e0d6ab..ed1f04c 100644
--- a/registerpages.py
+++ b/registerpages.py
@@ -17,7 +17,6 @@
from features import autolink
from features import dateaction
from features import banspammer
-from features import hotlistcreate
from features import hotlistdetails
from features import hotlistissues
from features import hotlistissuescsv
@@ -25,21 +24,16 @@
from features import filterrules
from features import pubsub
from features import userhotlists
-from features import inboundemail
from features import notify
from features import rerankhotlist
from features import savedqueries
-from framework import banned, excessiveactivity
-from framework import clientmon
from framework import csp_report
from framework import deleteusers
from framework import trimvisitedpages
from framework import reap
from framework import registerpages_helpers
-from framework import ts_mon_js
from framework import urls
-from framework import warmup
from project import peopledetail
from project import peoplelist
@@ -51,21 +45,11 @@
from project import projectupdates
from project import redirects
-from search import backendnonviewable
-from search import backendsearch
-
from services import cachemanager_svc
from services import client_config_svc
from sitewide import custom_404
-from sitewide import groupadmin
-from sitewide import groupcreate
-from sitewide import groupdetail
-from sitewide import grouplist
-from sitewide import moved
-from sitewide import projectcreate
from sitewide import userprofile
-from sitewide import usersettings
from sitewide import userclearbouncing
from sitewide import userupdates
@@ -158,13 +142,11 @@
def Register(self, services):
"""Register all the monorail request handlers."""
- self._RegisterFrameworkHandlers()
self._RegisterSitewideHandlers()
self._RegisterProjectHandlers()
self._RegisterIssueHandlers()
self._RegisterWebComponentsHanders()
self._RegisterRedirects()
- self._RegisterInboundMail()
# Register pRPC API routes
prpc_server = prpc.Server(
@@ -181,10 +163,6 @@
def _RegisterProjectHandlers(self):
"""Register page and form handlers that operate within a project."""
- self._SetupServlets({
- # Note: the following are at URLS that are not externally accessible.
- urls.NOTIFY_RULES_DELETED_TASK: notify.NotifyRulesDeletedTask,
- })
self._SetupProjectServlets(
{
urls.ADMIN_INTRO: projectsummary.ProjectSummary,
@@ -199,24 +177,6 @@
def _RegisterIssueHandlers(self):
"""Register page and form handlers for the issue tracker."""
- self._SetupServlets({
- # Note: the following are at URLs that are not externaly accessible.
- urls.BACKEND_SEARCH: backendsearch.BackendSearch,
- urls.BACKEND_NONVIEWABLE: backendnonviewable.BackendNonviewable,
- urls.RECOMPUTE_DERIVED_FIELDS_TASK:
- filterrules.RecomputeDerivedFieldsTask,
- urls.REINDEX_QUEUE_CRON: filterrules.ReindexQueueCron,
- urls.NOTIFY_ISSUE_CHANGE_TASK: notify.NotifyIssueChangeTask,
- urls.NOTIFY_BLOCKING_CHANGE_TASK: notify.NotifyBlockingChangeTask,
- urls.NOTIFY_BULK_CHANGE_TASK: notify.NotifyBulkChangeTask,
- urls.NOTIFY_APPROVAL_CHANGE_TASK: notify.NotifyApprovalChangeTask,
- urls.OUTBOUND_EMAIL_TASK: notify.OutboundEmailTask,
- urls.DATE_ACTION_CRON: dateaction.DateActionCron,
- urls.PUBLISH_PUBSUB_ISSUE_CHANGE_TASK:
- pubsub.PublishPubsubIssueChangeTask,
- urls.ISSUE_DATE_ACTION_TASK: dateaction.IssueDateActionTask,
- urls.FLT_ISSUE_CONVERSION_TASK: fltconversion.FLTConvertTask,
- })
self._SetupProjectServlets(
{
@@ -328,59 +288,8 @@
'/issues/': list_redir,
})
-
- def _RegisterFrameworkHandlers(self):
- """Register page and form handlers for framework functionality."""
- self._SetupServlets(
- {
- urls.CSP_REPORT:
- csp_report.CSPReportPage,
-
- # These are only shown to users if specific conditions are met.
- urls.EXCESSIVE_ACTIVITY:
- excessiveactivity.ExcessiveActivity,
- urls.BANNED:
- banned.Banned,
- urls.PROJECT_MOVED:
- moved.ProjectMoved,
-
- # These are not externally accessible
- urls.RAMCACHE_CONSOLIDATE_CRON:
- cachemanager_svc.RamCacheConsolidate,
- urls.REAP_CRON:
- reap.Reap,
- urls.LOAD_API_CLIENT_CONFIGS_CRON:
- (client_config_svc.LoadApiClientConfigs),
- urls.CLIENT_MON:
- clientmon.ClientMonitor,
- urls.TRIM_VISITED_PAGES_CRON:
- trimvisitedpages.TrimVisitedPages,
- urls.TS_MON_JS:
- ts_mon_js.MonorailTSMonJSHandler,
- urls.WARMUP:
- warmup.Warmup,
- urls.START:
- warmup.Start,
- urls.STOP:
- warmup.Stop
- })
-
def _RegisterSitewideHandlers(self):
"""Register page and form handlers that aren't associated with projects."""
- self._SetupServlets({
- urls.PROJECT_CREATE: projectcreate.ProjectCreate,
- # The user settings page is a site-wide servlet, not under /u/.
- urls.USER_SETTINGS: usersettings.UserSettings,
- urls.GROUP_CREATE: groupcreate.GroupCreate,
- urls.GROUP_LIST: grouplist.GroupList,
- urls.GROUP_DELETE: grouplist.GroupList,
- urls.HOTLIST_CREATE: hotlistcreate.HotlistCreate,
- urls.BAN_SPAMMER_TASK: banspammer.BanSpammerTask,
- urls.WIPEOUT_SYNC_CRON: deleteusers.WipeoutSyncCron,
- urls.SEND_WIPEOUT_USER_LISTS_TASK: deleteusers.SendWipeoutUserListsTask,
- urls.DELETE_WIPEOUT_USERS_TASK: deleteusers.DeleteWipeoutUsersTask,
- urls.DELETE_USERS_TASK: deleteusers.DeleteUsersTask,
- })
self._SetupUserServlets({
urls.USER_PROFILE: userprofile.UserProfile,
@@ -405,11 +314,6 @@
urls.USER_PROFILE, 'u')
self._SetupUserServlets({'': profile_redir})
- self._SetupGroupServlets({
- urls.GROUP_DETAIL: groupdetail.GroupDetail,
- urls.GROUP_ADMIN: groupadmin.GroupAdmin,
- })
-
def _RegisterWebComponentsHanders(self):
"""Register page handlers that are handled by WebComponentsPage."""
self._AddRoute('/', webcomponentspage.ProjectListPage, 'GET')
@@ -423,8 +327,6 @@
redirect = registerpages_helpers.MakeRedirect('/')
self._SetupServlets(
{
- '/hosting/': redirect,
- '/hosting': redirect,
'/p': redirect,
'/p/': redirect,
'/u': redirect,
@@ -439,26 +341,6 @@
'/people/': redirect,
})
- redirect = registerpages_helpers.MakeRedirect(urls.GROUP_LIST)
- self._SetupServlets({'/g': redirect})
-
- group_redir = registerpages_helpers.MakeRedirectInScope(
- urls.USER_PROFILE, 'g')
- self._SetupGroupServlets({'': group_redir})
-
- def _RegisterInboundMail(self):
- """Register a handler for inbound email and email bounces."""
- self.routes.append(
- webapp2.Route(
- '/_ah/mail/<project_addr:.+>',
- handler=inboundemail.InboundEmail,
- methods=['POST', 'GET']))
- self.routes.append(
- webapp2.Route(
- '/_ah/bounce',
- handler=inboundemail.BouncedEmail,
- methods=['POST', 'GET']))
-
def _RegisterErrorPages(self):
"""Register handlers for errors."""
self._AddRoute(
diff --git a/requirements.txt b/requirements.txt
index 51dec20..276e93e 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -7,6 +7,7 @@
chardet==4.0.0
Click==7.0
ezt==1.1
+fixtures==4.0.1
Flask==1.0.2
frozendict==2.0.6
google-api-core==1.31.5
@@ -32,13 +33,16 @@
Jinja2==2.10.1
MarkupSafe==1.1.1
mock==4.0.3
+mox3==1.1.0
mysqlclient==2.1.1
oauth2client==4.1.3
packaging==16.8
+pbr==5.9.0
Pillow==8.3.1
pluggy==0.13.1
proto-plus==1.20.3
protobuf==3.19.3
+protorpc==0.12.0
py==1.10.0
pyasn1==0.4.8
pyasn1-modules==0.2.8
@@ -53,4 +57,4 @@
toml==0.10.1
uritemplate==3.0.0
urllib3==1.26.4
-Werkzeug==1.0.1
+Werkzeug==1.0.1
\ No newline at end of file
diff --git a/search/backendnonviewable.py b/search/backendnonviewable.py
index 23b601c..ab751a4 100644
--- a/search/backendnonviewable.py
+++ b/search/backendnonviewable.py
@@ -37,8 +37,7 @@
NONVIEWABLE_MEMCACHE_EXPIRATION = 15 * framework_constants.SECS_PER_MINUTE
-# Change to FlaskInternalTask
-class BackendNonviewable(jsonfeed.InternalTask):
+class BackendNonviewable(jsonfeed.FlaskInternalTask):
"""JSON servlet for getting issue IDs that the specified user cannot view."""
CHECK_SAME_APP = True
@@ -137,8 +136,8 @@
return ok_iids
- # def GetBackendNonviewable(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetBackendNonviewable(self, **kwargs):
+ return self.handler(**kwargs)
- # def PostBackendNonviewable(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostBackendNonviewable(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/search/backendsearch.py b/search/backendsearch.py
index 1da9975..91a00dc 100644
--- a/search/backendsearch.py
+++ b/search/backendsearch.py
@@ -31,8 +31,7 @@
from tracker import tracker_constants
-# Change to FlaskInternalTask
-class BackendSearch(jsonfeed.InternalTask):
+class BackendSearch(jsonfeed.FlaskInternalTask):
"""JSON servlet for issue search in a GAE backend."""
CHECK_SAME_APP = True
@@ -76,8 +75,8 @@
'error': error_message,
}
- # def GetBackendSearch(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetBackendSearch(self, **kwargs):
+ return self.handler(**kwargs)
- # def PostBackendSearch(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostBackendSearch(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/search/test/backendnonviewable_test.py b/search/test/backendnonviewable_test.py
index 6c50fb7..5360a93 100644
--- a/search/test/backendnonviewable_test.py
+++ b/search/test/backendnonviewable_test.py
@@ -9,7 +9,10 @@
from __future__ import absolute_import
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.api import memcache
from google.appengine.ext import testbed
@@ -36,8 +39,7 @@
self.mr.shard_id = 2
self.mr.invalidation_timestep = 12345
- self.servlet = backendnonviewable.BackendNonviewable(
- 'req', 'res', services=self.services)
+ self.servlet = backendnonviewable.BackendNonviewable(services=self.services)
self.mox = mox.Mox()
self.testbed = testbed.Testbed()
diff --git a/search/test/backendsearch_test.py b/search/test/backendsearch_test.py
index dd5ed18..6a9a710 100644
--- a/search/test/backendsearch_test.py
+++ b/search/test/backendsearch_test.py
@@ -9,7 +9,10 @@
from __future__ import absolute_import
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import settings
from search import backendsearch
@@ -31,8 +34,7 @@
self.mr.specified_logged_in_user_id = 111
self.mr.specified_me_user_ids = [222]
self.mr.shard_id = 2
- self.servlet = backendsearch.BackendSearch(
- 'req', 'res', services=self.services)
+ self.servlet = backendsearch.BackendSearch(services=self.services)
self.mox = mox.Mox()
def tearDown(self):
diff --git a/search/test/backendsearchpipeline_test.py b/search/test/backendsearchpipeline_test.py
index 212f5a6..dab2dba 100644
--- a/search/test/backendsearchpipeline_test.py
+++ b/search/test/backendsearchpipeline_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
from google.appengine.api import memcache
diff --git a/search/test/frontendsearchpipeline_test.py b/search/test/frontendsearchpipeline_test.py
index 432a9d1..9a94c3d 100644
--- a/search/test/frontendsearchpipeline_test.py
+++ b/search/test/frontendsearchpipeline_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
from google.appengine.api import memcache
diff --git a/search/test/search_helpers_test.py b/search/test/search_helpers_test.py
index 5905234..b9cfb51 100644
--- a/search/test/search_helpers_test.py
+++ b/search/test/search_helpers_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
from search import search_helpers
diff --git a/services/cachemanager_svc.py b/services/cachemanager_svc.py
index 02ad6dd..753bffa 100644
--- a/services/cachemanager_svc.py
+++ b/services/cachemanager_svc.py
@@ -44,6 +44,7 @@
import logging
from framework import jsonfeed
+from framework import logger
from framework import sql
@@ -119,19 +120,26 @@
def StoreInvalidateRows(self, cnxn, kind, keys):
"""Store rows to let all jobs know to invalidate the given keys."""
assert kind in INVALIDATE_KIND_VALUES
+ logger.log(
+ {
+ 'log_type': 'cache/invalidate/rows',
+ 'kind': kind,
+ 'count': len(keys),
+ 'keys': str(keys),
+ })
self.invalidate_tbl.InsertRows(
cnxn, ['kind', 'cache_key'], [(kind, key) for key in keys])
def StoreInvalidateAll(self, cnxn, kind):
"""Store a value to tell all jobs to invalidate all items of this kind."""
+ logger.log({'log_type': 'cache/invalidate/all', 'kind': kind})
last_timestep = self.invalidate_tbl.InsertRow(
cnxn, kind=kind, cache_key=INVALIDATE_ALL_KEYS)
self.invalidate_tbl.Delete(
cnxn, kind=kind, where=[('timestep < %s', [last_timestep])])
-# TODO: change to FlaskInternalTask when convert to Flask
-class RamCacheConsolidate(jsonfeed.InternalTask):
+class RamCacheConsolidate(jsonfeed.FlaskInternalTask):
"""Drop old Invalidate rows when there are too many of them."""
def HandleRequest(self, mr):
@@ -166,8 +174,5 @@
'new_count': new_count,
}
- # def GetRamCacheConsolidate(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostRamCacheConsolidate(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetRamCacheConsolidate(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/services/caches.py b/services/caches.py
index 35276a0..8869d61 100644
--- a/services/caches.py
+++ b/services/caches.py
@@ -27,6 +27,7 @@
import settings
from framework import framework_constants
+from framework import logger
DEFAULT_MAX_SIZE = 10000
@@ -253,6 +254,16 @@
self._WriteToMemcache(retrieved_dict)
still_missing_keys = [key for key in keys if key not in result_dict]
+ if still_missing_keys:
+ # The keys were not found in the caches or the DB.
+ logger.log(
+ {
+ 'log_type': 'database/missing_keys',
+ 'kind': self.cache.kind,
+ 'prefix': self.prefix,
+ 'count': len(still_missing_keys),
+ 'keys': str(still_missing_keys)
+ })
return result_dict, still_missing_keys
def LocalInvalidateAll(self):
@@ -329,6 +340,14 @@
def _DeleteFromMemcache(self, keys):
# type: (Sequence[str]) -> None
"""Delete key-values from memcache. """
+ logger.log(
+ {
+ 'log_type': 'cache/memcache/delete',
+ 'kind': self.cache.kind,
+ 'prefix': self.prefix,
+ 'count': len(keys),
+ 'keys': str(keys)
+ })
memcache.delete_multi(
[self._KeyToStr(key) for key in keys],
seconds=5,
diff --git a/services/client_config_svc.py b/services/client_config_svc.py
index ce85a95..d5d6a25 100644
--- a/services/client_config_svc.py
+++ b/services/client_config_svc.py
@@ -14,6 +14,7 @@
import time
from six.moves import urllib
import webapp2
+import flask
from google.appengine.api import app_identity
from google.appengine.api import urlfetch
@@ -45,108 +46,77 @@
configs = db.TextProperty()
-# Note: The cron job must have hit the servlet before this will work.
-# when convert to flask replace the webapp2.RequestHandler to Object
-class LoadApiClientConfigs(webapp2.RequestHandler):
+_CONFIG_LOADS = ts_mon.CounterMetric(
+ 'monorail/client_config_svc/loads', 'Results of fetches from luci-config.',
+ [ts_mon.BooleanField('success'),
+ ts_mon.StringField('type')])
- config_loads = ts_mon.CounterMetric(
- 'monorail/client_config_svc/loads',
- 'Results of fetches from luci-config.',
- [ts_mon.BooleanField('success'), ts_mon.StringField('type')])
- def get(self):
- global service_account_map
- global qpm_dict
- authorization_token, _ = app_identity.get_access_token(
+def _process_response(response):
+ try:
+ content = json.loads(response.content)
+ except ValueError:
+ logging.error('Response was not JSON: %r', response.content)
+ _CONFIG_LOADS.increment({'success': False, 'type': 'json-load-error'})
+ raise
+
+ try:
+ config_content = content['content']
+ except KeyError:
+ logging.error('JSON contained no content: %r', content)
+ _CONFIG_LOADS.increment({'success': False, 'type': 'json-key-error'})
+ raise
+
+ try:
+ content_text = base64.b64decode(config_content)
+ except TypeError:
+ logging.error('Content was not b64: %r', config_content)
+ _CONFIG_LOADS.increment({'success': False, 'type': 'b64-decode-error'})
+ raise
+
+ try:
+ cfg = api_clients_config_pb2.ClientCfg()
+ text_format.Merge(content_text, cfg)
+ except:
+ logging.error('Content was not a valid ClientCfg proto: %r', content_text)
+ _CONFIG_LOADS.increment({'success': False, 'type': 'proto-load-error'})
+ raise
+
+ return content_text
+
+
+def GetLoadApiClientConfigs():
+ global service_account_map
+ global qpm_dict
+ authorization_token, _ = app_identity.get_access_token(
framework_constants.OAUTH_SCOPE)
- response = urlfetch.fetch(
+ response = urlfetch.fetch(
LUCI_CONFIG_URL,
method=urlfetch.GET,
follow_redirects=False,
- headers={'Content-Type': 'application/json; charset=UTF-8',
- 'Authorization': 'Bearer ' + authorization_token})
+ headers={
+ 'Content-Type': 'application/json; charset=UTF-8',
+ 'Authorization': 'Bearer ' + authorization_token
+ })
- if response.status_code != 200:
- logging.error('Invalid response from luci-config: %r', response)
- self.config_loads.increment({'success': False, 'type': 'luci-cfg-error'})
- self.abort(500, 'Invalid response from luci-config')
+ if response.status_code != 200:
+ logging.error('Invalid response from luci-config: %r', response)
+ _CONFIG_LOADS.increment({'success': False, 'type': 'luci-cfg-error'})
+ flask.abort(500, 'Invalid response from luci-config')
- try:
- content_text = self._process_response(response)
- except Exception as e:
- self.abort(500, str(e))
+ try:
+ content_text = _process_response(response)
+ except Exception as e:
+ flask.abort(500, str(e))
- logging.info('luci-config content decoded: %r.', content_text)
- configs = ClientConfig(configs=content_text,
- key_name='api_client_configs')
- configs.put()
- service_account_map = None
- qpm_dict = None
- self.config_loads.increment({'success': True, 'type': 'success'})
+ logging.info('luci-config content decoded: %r.', content_text)
+ configs = ClientConfig(configs=content_text, key_name='api_client_configs')
+ configs.put()
+ service_account_map = None
+ qpm_dict = None
+ _CONFIG_LOADS.increment({'success': True, 'type': 'success'})
- def _process_response(self, response):
- try:
- content = json.loads(response.content)
- except ValueError:
- logging.error('Response was not JSON: %r', response.content)
- self.config_loads.increment({'success': False, 'type': 'json-load-error'})
- raise
-
- try:
- config_content = content['content']
- except KeyError:
- logging.error('JSON contained no content: %r', content)
- self.config_loads.increment({'success': False, 'type': 'json-key-error'})
- raise
-
- try:
- content_text = base64.b64decode(config_content)
- except TypeError:
- logging.error('Content was not b64: %r', config_content)
- self.config_loads.increment({'success': False,
- 'type': 'b64-decode-error'})
- raise
-
- try:
- cfg = api_clients_config_pb2.ClientCfg()
- text_format.Merge(content_text, cfg)
- except:
- logging.error('Content was not a valid ClientCfg proto: %r', content_text)
- self.config_loads.increment({'success': False,
- 'type': 'proto-load-error'})
- raise
-
- return content_text
-
- # def GetLoadApiClientConfigs(self):
- # global service_account_map
- # global qpm_dict
- # authorization_token, _ = app_identity.get_access_token(
- # framework_constants.OAUTH_SCOPE)
- # response = urlfetch.fetch(
- # LUCI_CONFIG_URL,
- # method=urlfetch.GET,
- # follow_redirects=False,
- # headers={'Content-Type': 'application/json; charset=UTF-8',
- # 'Authorization': 'Bearer ' + authorization_token})
-
- # if response.status_code != 200:
- # logging.error('Invalid response from luci-config: %r', response)
- # self.config_loads.increment({'success': False, 'type': 'luci-cfg-error'})
- # flask.abort(500, 'Invalid response from luci-config')
-
- # try:
- # content_text = self._process_response(response)
- # except Exception as e:
- # flask.abort(500, str(e))
-
- # logging.info('luci-config content decoded: %r.', content_text)
- # configs = ClientConfig(configs=content_text,
- # key_name='api_client_configs')
- # configs.put()
- # service_account_map = None
- # qpm_dict = None
- # self.config_loads.increment({'success': True, 'type': 'success'})
+ return ''
class ClientConfigService(object):
diff --git a/services/test/cachemanager_svc_test.py b/services/test/cachemanager_svc_test.py
index 20956e0..b84d33e 100644
--- a/services/test/cachemanager_svc_test.py
+++ b/services/test/cachemanager_svc_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from framework import sql
from services import cachemanager_svc
@@ -168,8 +171,7 @@
sql.SQLTableManager)
self.services = service_manager.Services(
cache_manager=self.cache_manager)
- self.servlet = cachemanager_svc.RamCacheConsolidate(
- 'req', 'res', services=self.services)
+ self.servlet = cachemanager_svc.RamCacheConsolidate(services=self.services)
def testHandleRequest_NothingToDo(self):
mr = testing_helpers.MakeMonorailRequest()
diff --git a/services/test/chart_svc_test.py b/services/test/chart_svc_test.py
index fbd87df..470bc80 100644
--- a/services/test/chart_svc_test.py
+++ b/services/test/chart_svc_test.py
@@ -10,7 +10,10 @@
from __future__ import absolute_import
import datetime
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import re
import settings
import unittest
diff --git a/services/test/client_config_svc_test.py b/services/test/client_config_svc_test.py
index 5e9b87a..d8a305e 100644
--- a/services/test/client_config_svc_test.py
+++ b/services/test/client_config_svc_test.py
@@ -20,35 +20,32 @@
def __init__(self, content):
self.content = content
- def setUp(self):
- self.handler = client_config_svc.LoadApiClientConfigs()
-
def testProcessResponse_InvalidJSON(self):
r = self.FakeResponse('}{')
with self.assertRaises(ValueError):
- self.handler._process_response(r)
+ client_config_svc._process_response(r)
def testProcessResponse_NoContent(self):
r = self.FakeResponse('{"wrong-key": "some-value"}')
with self.assertRaises(KeyError):
- self.handler._process_response(r)
+ client_config_svc._process_response(r)
def testProcessResponse_NotB64(self):
# 'asd' is not a valid base64-encoded string.
r = self.FakeResponse('{"content": "asd"}')
with self.assertRaises(TypeError):
- self.handler._process_response(r)
+ client_config_svc._process_response(r)
def testProcessResponse_NotProto(self):
# 'asdf' is a valid base64-encoded string.
r = self.FakeResponse('{"content": "asdf"}')
with self.assertRaises(Exception):
- self.handler._process_response(r)
+ client_config_svc._process_response(r)
def testProcessResponse_Success(self):
with open(client_config_svc.CONFIG_FILE_PATH) as f:
r = self.FakeResponse('{"content": "%s"}' % base64.b64encode(f.read()))
- c = self.handler._process_response(r)
+ c = client_config_svc._process_response(r)
assert '123456789.apps.googleusercontent.com' in c
diff --git a/services/test/config_svc_test.py b/services/test/config_svc_test.py
index 6d1d941..dd2796c 100644
--- a/services/test/config_svc_test.py
+++ b/services/test/config_svc_test.py
@@ -13,7 +13,10 @@
import logging
import mock
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.api import memcache
from google.appengine.ext import testbed
diff --git a/services/test/features_svc_test.py b/services/test/features_svc_test.py
index c80b819..d285152 100644
--- a/services/test/features_svc_test.py
+++ b/services/test/features_svc_test.py
@@ -9,7 +9,10 @@
from __future__ import absolute_import
import logging
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import time
import unittest
import mock
diff --git a/services/test/fulltext_helpers_test.py b/services/test/fulltext_helpers_test.py
index 1e4f0c9..fbff1b8 100644
--- a/services/test/fulltext_helpers_test.py
+++ b/services/test/fulltext_helpers_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.api import search
diff --git a/services/test/issue_svc_test.py b/services/test/issue_svc_test.py
index b6fe682..fe41aa4 100644
--- a/services/test/issue_svc_test.py
+++ b/services/test/issue_svc_test.py
@@ -15,7 +15,10 @@
import unittest
from mock import patch, Mock, ANY
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.api import search
from google.appengine.ext import testbed
diff --git a/services/test/project_svc_test.py b/services/test/project_svc_test.py
index 2eb7a2b..48de180 100644
--- a/services/test/project_svc_test.py
+++ b/services/test/project_svc_test.py
@@ -11,7 +11,10 @@
import time
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import mock
from google.appengine.ext import testbed
diff --git a/services/test/spam_svc_test.py b/services/test/spam_svc_test.py
index 67b53cf..351ec62 100644
--- a/services/test/spam_svc_test.py
+++ b/services/test/spam_svc_test.py
@@ -11,7 +11,10 @@
import mock
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.ext import testbed
diff --git a/services/test/star_svc_test.py b/services/test/star_svc_test.py
index 03a0d23..3a5ce74 100644
--- a/services/test/star_svc_test.py
+++ b/services/test/star_svc_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import mock
from google.appengine.ext import testbed
diff --git a/services/test/tracker_fulltext_test.py b/services/test/tracker_fulltext_test.py
index db8a7a7..a4c935e 100644
--- a/services/test/tracker_fulltext_test.py
+++ b/services/test/tracker_fulltext_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.api import search
diff --git a/services/test/user_svc_test.py b/services/test/user_svc_test.py
index 4a8eb16..323d3eb 100644
--- a/services/test/user_svc_test.py
+++ b/services/test/user_svc_test.py
@@ -11,7 +11,10 @@
import unittest
import mock
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import time
from google.appengine.ext import testbed
diff --git a/services/test/usergroup_svc_test.py b/services/test/usergroup_svc_test.py
index 5bfd899..10b2c8a 100644
--- a/services/test/usergroup_svc_test.py
+++ b/services/test/usergroup_svc_test.py
@@ -12,7 +12,10 @@
import mock
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.ext import testbed
diff --git a/sitewide/groupadmin.py b/sitewide/groupadmin.py
index 3e8bcce..09593e0 100644
--- a/sitewide/groupadmin.py
+++ b/sitewide/groupadmin.py
@@ -23,7 +23,7 @@
from sitewide import group_helpers
-class GroupAdmin(servlet.Servlet):
+class GroupAdmin(flaskservlet.FlaskServlet):
"""The group admin page."""
_PAGE_TEMPLATE = 'sitewide/group-admin-page.ezt'
@@ -123,8 +123,8 @@
mr, '/g/%s%s' % (group_name, urls.GROUP_ADMIN),
include_project=False, saved=1, ts=int(time.time()))
- # def GetGroupAdmin(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetGroupAdmin(self, **kwargs):
+ return self.handler(**kwargs)
- # def PostGroupAdmin(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostGroupAdmin(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/sitewide/groupcreate.py b/sitewide/groupcreate.py
index ce0f151..2c0ece7 100644
--- a/sitewide/groupcreate.py
+++ b/sitewide/groupcreate.py
@@ -14,12 +14,11 @@
from framework import exceptions, flaskservlet
from framework import framework_helpers
from framework import permissions
-from framework import servlet
from proto import usergroup_pb2
from sitewide import group_helpers
-class GroupCreate(servlet.Servlet):
+class GroupCreate(flaskservlet.FlaskServlet):
"""Shows a page with a simple form to create a user group."""
_PAGE_TEMPLATE = 'sitewide/group-create-page.ezt'
@@ -103,8 +102,8 @@
return framework_helpers.FormatAbsoluteURL(
mr, '/g/%s/' % group_id, include_project=False)
- # def GetGroupCreate(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetGroupCreate(self, **kwargs):
+ return self.handler(**kwargs)
- # def PostGroupCreate(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostGroupCreate(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/sitewide/groupdetail.py b/sitewide/groupdetail.py
index 4428a53..1f16a0a 100644
--- a/sitewide/groupdetail.py
+++ b/sitewide/groupdetail.py
@@ -28,7 +28,7 @@
MEMBERS_PER_PAGE = 50
-class GroupDetail(servlet.Servlet):
+class GroupDetail(flaskservlet.FlaskServlet):
"""The group detail page presents information about one user group."""
_PAGE_TEMPLATE = 'sitewide/group-detail-page.ezt'
@@ -189,9 +189,7 @@
String URL to redirect the user to after processing.
"""
# 1. Gather data from the request.
- remove_strs = post_data.getall('remove')
- # TODO(crbug.com/monorail/10936): getall in Flask is getlist
- # remove_strs = post_data.getlist('remove')
+ remove_strs = post_data.getlist('remove')
logging.info('remove_strs = %r', remove_strs)
if not remove_strs:
@@ -212,8 +210,8 @@
mr, '/g/%s/' % mr.viewed_username, include_project=False,
saved=1, ts=int(time.time()))
- # def GetGroupDetail(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetGroupDetail(self, **kwargs):
+ return self.handler(**kwargs)
- # def PostGroupDetail(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostGroupDetail(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/sitewide/grouplist.py b/sitewide/grouplist.py
index 57c46e9..627ca22 100644
--- a/sitewide/grouplist.py
+++ b/sitewide/grouplist.py
@@ -86,14 +86,14 @@
# return self.handler(**kwargs)
-class GroupDelete(flaskservlet.FlaskServlet):
+class FlaskGroupList(flaskservlet.FlaskServlet):
"""Shows a page with a simple form to create a user group."""
_PAGE_TEMPLATE = 'sitewide/group-list-page.ezt'
def AssertBasePermission(self, mr):
"""Assert that the user has the permissions needed to view this page."""
- super(GroupDelete, self).AssertBasePermission(mr)
+ super(FlaskGroupList, self).AssertBasePermission(mr)
if not mr.perms.HasPerm(permissions.VIEW_GROUP, None, None):
raise permissions.PermissionException(
@@ -150,3 +150,6 @@
def PostGroupDelete(self, **kwargs):
return self.handler(**kwargs)
+
+ def GetGroupList(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/sitewide/moved.py b/sitewide/moved.py
index 968422c..8c2935f 100644
--- a/sitewide/moved.py
+++ b/sitewide/moved.py
@@ -20,7 +20,7 @@
from project import project_constants
-class ProjectMoved(servlet.Servlet):
+class ProjectMoved(flaskservlet.FlaskServlet):
"""The ProjectMoved page explains that the project has moved."""
_PAGE_TEMPLATE = 'sitewide/moved-page.ezt'
@@ -61,5 +61,5 @@
'moved_to_url': moved_to_url,
}
- # def GetProjectMoved(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetProjectMoved(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/sitewide/projectcreate.py b/sitewide/projectcreate.py
index c3f8cca..37d02c8 100644
--- a/sitewide/projectcreate.py
+++ b/sitewide/projectcreate.py
@@ -37,7 +37,7 @@
_MSG_MISSING_PROJECT_SUMMARY = 'Missing project summary'
-class ProjectCreate(servlet.Servlet):
+class ProjectCreate(flaskservlet.FlaskServlet):
"""Shows a page with a simple form to create a project."""
_PAGE_TEMPLATE = 'sitewide/project-create-page.ezt'
@@ -156,8 +156,8 @@
return framework_helpers.FormatAbsoluteURL(
mr, urls.ADMIN_INTRO, project_name=project_name)
- # def GetCreateProject(self, **kwargs):
- # return self.handler(**kwargs)
+ def GetCreateProject(self, **kwargs):
+ return self.handler(**kwargs)
- # def PostCreateProject(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostCreateProject(self, **kwargs):
+ return self.handler(**kwargs)
diff --git a/sitewide/test/groupadmin_test.py b/sitewide/test/groupadmin_test.py
index d1f7e0f..72dfa9d 100644
--- a/sitewide/test/groupadmin_test.py
+++ b/sitewide/test/groupadmin_test.py
@@ -34,8 +34,7 @@
self.services.usergroup.TestAddGroupSettings(888, 'group@example.com')
self.services.usergroup.TestAddGroupSettings(
999, 'importgroup@example.com', external_group_type='mdb')
- self.servlet = groupadmin.GroupAdmin(
- 'req', 'res', services=self.services)
+ self.servlet = groupadmin.GroupAdmin(services=self.services)
self.mr = testing_helpers.MakeMonorailRequest()
self.mr.viewed_username = 'group@example.com'
self.mr.viewed_user_auth.user_id = 888
diff --git a/sitewide/test/groupcreate_test.py b/sitewide/test/groupcreate_test.py
index bf7be8d..f82fb74 100644
--- a/sitewide/test/groupcreate_test.py
+++ b/sitewide/test/groupcreate_test.py
@@ -28,8 +28,7 @@
user=fake.UserService(),
usergroup=fake.UserGroupService(),
project=fake.ProjectService())
- self.servlet = groupcreate.GroupCreate(
- 'req', 'res', services=self.services)
+ self.servlet = groupcreate.GroupCreate(services=self.services)
self.mr = testing_helpers.MakeMonorailRequest()
def CheckAssertBasePermissions(
diff --git a/sitewide/test/groupdetail_test.py b/sitewide/test/groupdetail_test.py
index 4440bb8..f294606 100644
--- a/sitewide/test/groupdetail_test.py
+++ b/sitewide/test/groupdetail_test.py
@@ -31,8 +31,7 @@
self.services.user.TestAddUser('c@example.com', 333)
self.services.user.TestAddUser('group@example.com', 888)
self.services.usergroup.TestAddGroupSettings(888, 'group@example.com')
- self.servlet = groupdetail.GroupDetail(
- 'req', 'res', services=self.services)
+ self.servlet = groupdetail.GroupDetail(services=self.services)
self.mr = testing_helpers.MakeMonorailRequest()
self.mr.viewed_username = 'group@example.com'
self.mr.viewed_user_auth.user_id = 888
@@ -142,5 +141,3 @@
self.assertRaises(
exceptions.NoSuchGroupException,
self.servlet.GatherPageData, self.mr)
-
-
diff --git a/sitewide/test/moved_test.py b/sitewide/test/moved_test.py
index 04b9165..eccb195 100644
--- a/sitewide/test/moved_test.py
+++ b/sitewide/test/moved_test.py
@@ -24,7 +24,7 @@
def setUp(self):
self.services = service_manager.Services(
project=fake.ProjectService())
- self.servlet = moved.ProjectMoved('req', 'res', services=self.services)
+ self.servlet = moved.ProjectMoved(services=self.services)
self.old_project = 'old-project'
def testGatherPageData_NoProjectSpecified(self):
@@ -40,7 +40,7 @@
_, mr = testing_helpers.GetRequestObjects(
path='/hosting/moved?project=nonexistent')
- with self.assertRaises(webapp2.HTTPException) as cm:
+ with self.assertRaises(Exception) as cm:
self.servlet.GatherPageData(mr)
self.assertEqual(404, cm.exception.code)
@@ -50,7 +50,7 @@
_, mr = testing_helpers.GetRequestObjects(
path='/hosting/moved?project=%s' % self.old_project)
- with self.assertRaises(webapp2.HTTPException) as cm:
+ with self.assertRaises(Exception) as cm:
self.servlet.GatherPageData(mr)
self.assertEqual(400, cm.exception.code)
diff --git a/sitewide/test/projectcreate_test.py b/sitewide/test/projectcreate_test.py
index 8f468dd..13dc97b 100644
--- a/sitewide/test/projectcreate_test.py
+++ b/sitewide/test/projectcreate_test.py
@@ -24,7 +24,7 @@
def setUp(self):
services = service_manager.Services()
- self.servlet = projectcreate.ProjectCreate('req', 'res', services=services)
+ self.servlet = projectcreate.ProjectCreate(services=services)
def CheckAssertBasePermissions(
self, restriction, expect_admin_ok, expect_nonadmin_ok):
diff --git a/sitewide/test/usersettings_test.py b/sitewide/test/usersettings_test.py
index 54c14ae..dd7d252 100644
--- a/sitewide/test/usersettings_test.py
+++ b/sitewide/test/usersettings_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from framework import framework_helpers
from framework import permissions
@@ -27,8 +30,7 @@
def setUp(self):
self.mox = mox.Mox()
self.services = service_manager.Services(user=fake.UserService())
- self.servlet = usersettings.UserSettings(
- 'req', 'res', services=self.services)
+ self.servlet = usersettings.UserSettings(services=self.services)
def tearDown(self):
self.mox.UnsetStubs()
diff --git a/sitewide/test/userupdates_test.py b/sitewide/test/userupdates_test.py
index efae9bc..725b123 100644
--- a/sitewide/test/userupdates_test.py
+++ b/sitewide/test/userupdates_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from features import activities
from services import service_manager
@@ -112,4 +115,3 @@
self.assertEqual('st5', page_data['user_tab_mode'])
self.assertEqual('yes', page_data['viewing_user_page'])
self.assertEqual(uui._TAB_MODE, page_data['user_updates_tab_mode'])
-
diff --git a/sitewide/usersettings.py b/sitewide/usersettings.py
index 8484afc..163d731 100644
--- a/sitewide/usersettings.py
+++ b/sitewide/usersettings.py
@@ -14,14 +14,14 @@
import ezt
from businesslogic import work_env
+from framework import flaskservlet
from framework import framework_helpers
from framework import permissions
-from framework import servlet
from framework import template_helpers
from framework import urls
-class UserSettings(servlet.Servlet):
+class UserSettings(flaskservlet.FlaskServlet):
"""Shows a page with a simple form to edit user preferences."""
_PAGE_TEMPLATE = 'sitewide/user-settings-page.ezt'
@@ -66,10 +66,8 @@
# pylint: disable=unused-argument
def GetUserSetting(self, **kwargs):
- return
- # return self.handler(**kwargs)
+ return self.handler(**kwargs)
# pylint: disable=unused-argument
def PostUserSetting(self, **kwargs):
- return
- # return self.handler(**kwargs)
+ return self.handler(**kwargs)
diff --git a/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.js b/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.js
index d9cec5e..248c7d5 100644
--- a/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.js
+++ b/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.js
@@ -12,7 +12,7 @@
import {arrayToEnglish} from 'shared/helpers.js';
import './mr-edit-metadata.js';
import 'shared/typedef.js';
-
+import {migratedTypes} from 'shared/issue-fields.js';
import ClientLogger from 'monitoring/client-logger.js';
const DEBOUNCED_PRESUBMIT_TIME_OUT = 400;
@@ -45,11 +45,7 @@
class="warning-icon material-icons"
icon="warning"
>warning</i>
- <p>
- This issue has moved to
- ${this._migratedLink}. Updates should be posted in
- ${this._migratedLink}.
- </p>
+ ${this._migratedLink}
</div>
<chops-button
class="legacy-edit"
@@ -130,6 +126,12 @@
type: String,
},
/**
+ * Type of tracker the issue was migrated to.
+ */
+ migratedType: {
+ type: migratedTypes,
+ },
+ /**
* All comments, including descriptions.
*/
comments: {
@@ -211,7 +213,7 @@
/** @override */
stateChanged(state) {
this.migratedId = issueV0.migratedId(state);
-
+ this.migratedType = issueV0.migratedType(state);
this.issue = issueV0.viewedIssue(state);
this.issueRef = issueV0.viewedIssueRef(state);
this.comments = issueV0.comments(state);
@@ -347,10 +349,16 @@
}
/**
- * @return {string} the link of the issue in Issue Tracker.
+ * @return {string} the link of the issue in Issue Tracker or Launch.
*/
- get _migratedLink() {
- return html`<a href="https://issuetracker.google.com/issues/${this.migratedId}">b/${this.migratedId}</a>`;
+ get _migratedLink() {
+ if (this.migratedType === migratedTypes.BUGANIZER_TYPE) {
+ const link =
+ html`<a href="https://issuetracker.google.com/issues/${this.migratedId}">b/${this.migratedId}</a>`;
+ return html`<p>This issue has moved to ${link}. Updates should be posted in ${link}.</p>`;
+ } else {
+ return html`<p>This issue has been migrated to Launch, see link in final comment below.</p>`;
+ }
}
/**
diff --git a/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.test.js b/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.test.js
index 880064b..e781328 100644
--- a/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.test.js
+++ b/static_src/elements/issue-detail/metadata/mr-edit-metadata/mr-edit-issue.test.js
@@ -7,6 +7,7 @@
import {prpcClient} from 'prpc-client-instance.js';
import {MrEditIssue, allowRemovedRestrictions} from './mr-edit-issue.js';
import {clientLoggerFake} from 'shared/test/fakes.js';
+import {migratedTypes} from 'shared/issue-fields.js';
let element;
let clock;
@@ -308,16 +309,34 @@
it('shows notice if issue migrated', async () => {
element.migratedId = '1234';
-
+ element.migratedType = migratedTypes.LAUNCH_TYPE;
await element.updateComplete;
assert.isNotNull(element.querySelector('.migrated-banner'));
assert.isNotNull(element.querySelector('.legacy-edit'));
});
+ it('shows buganizer link when migrated to buganizer', async () => {
+ element.migratedId = '1234';
+ element.migratedType = migratedTypes.BUGANIZER_TYPE;
+ await element.updateComplete;
+
+ const link = element.querySelector('.migrated-banner a');
+ assert.include(link.textContent, 'b/1234');
+ });
+
+ it('shows launch banner when migrated to launch', async () => {
+ element.migratedId = '1234';
+ element.migratedType = migratedTypes.LAUNCH_TYPE;
+ await element.updateComplete;
+
+ const link = element.querySelector('.migrated-banner');
+ assert.include(link.textContent, 'This issue has been migrated to Launch, see link in final comment below');
+ });
+
it('hides edit form if issue migrated', async () => {
element.migratedId = '1234';
-
+ element.migratedType = migratedTypes.LAUNCH_TYPE;
await element.updateComplete;
const editForm = element.querySelector('mr-edit-metadata');
@@ -326,7 +345,7 @@
it('unhides edit form on button click', async () => {
element.migratedId = '1234';
-
+ element.migratedType = migratedTypes.LAUNCH_TYPE;
await element.updateComplete;
const button = element.querySelector('.legacy-edit');
diff --git a/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.js b/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.js
index e27a5fe..fe4ba3c 100644
--- a/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.js
+++ b/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.js
@@ -7,7 +7,7 @@
import {connectStore} from 'reducers/base.js';
import * as issueV0 from 'reducers/issueV0.js';
import {SHARED_STYLES} from 'shared/shared-styles.js';
-
+import {migratedTypes} from 'shared/issue-fields.js';
/**
* `<mr-migrated-banner>`
@@ -57,10 +57,7 @@
class="warning-icon material-icons"
icon="warning"
>warning</i>
- <p>
- This issue has been migrated to ${this._link}. Please see
- ${this._link} for the latest version of this discussion.
- </p>
+ ${this._link}
`;
}
@@ -72,6 +69,7 @@
type: Boolean,
reflect: true,
},
+ migratedType: {type: migratedTypes}
};
}
@@ -85,6 +83,7 @@
/** @override */
stateChanged(state) {
this.migratedId = issueV0.migratedId(state);
+ this.migratedType = issueV0.migratedType(state);
}
/** @override */
@@ -100,7 +99,13 @@
* @return {string} the link of the issue in Issue Tracker.
*/
get _link() {
- return html`<a href="https://issuetracker.google.com/issues/${this.migratedId}">b/${this.migratedId}</a>`;
+ if (this.migratedType === migratedTypes.BUGANIZER_TYPE) {
+ const link =
+ html`<a href="https://issuetracker.google.com/issues/${this.migratedId}">b/${this.migratedId}</a>`;
+ return html`<p>This issue has moved to ${link}. Updates should be posted in ${link}.</p>`;
+ } else {
+ return html`<p>This issue has been migrated to Launch, see link in final comment below.</p>`;
+ }
}
}
diff --git a/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.test.js b/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.test.js
index 4cceb2b..2114b61 100644
--- a/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.test.js
+++ b/static_src/elements/issue-detail/mr-issue-page/mr-migrated-banner.test.js
@@ -4,6 +4,7 @@
import {assert} from 'chai';
import {MrMigratedBanner} from './mr-migrated-banner.js';
+import {migratedTypes} from 'shared/issue-fields.js';
let element;
@@ -34,10 +35,29 @@
assert.isTrue(element.hasAttribute('hidden'));
});
- it('shows element when migratedId is set', async () => {
+ it('shows element when migratedId and migratedType are set', async () => {
+ element.migratedId = '1234';
+ element.migratedType = migratedTypes.BUGANIZER_TYPE;
await element.updateComplete;
assert.isFalse(element.hasAttribute('hidden'));
});
-});
+
+ it('shows buganizer link when migrated to buganizer', async () => {
+ element.migratedId = '1234';
+ element.migratedType = migratedTypes.BUGANIZER_TYPE;
+ await element.updateComplete;
+
+ const link = element.shadowRoot.querySelector('a');
+ assert.include(link.textContent, 'b/1234');
+ });
+
+ it('shows launch banner when migrated to launch', async () => {
+ element.migratedId = '1234';
+ element.migratedType = migratedTypes.LAUNCH_TYPE;
+ await element.updateComplete;
+
+ const link = element.shadowRoot.querySelector('p');
+ assert.include(link.textContent, 'This issue has been migrated to Launch, see link in final comment below');
+ });
+});
\ No newline at end of file
diff --git a/static_src/reducers/issueV0.js b/static_src/reducers/issueV0.js
index 880ebbc..8f670c9 100644
--- a/static_src/reducers/issueV0.js
+++ b/static_src/reducers/issueV0.js
@@ -14,7 +14,7 @@
import {createSelector} from 'reselect';
import {autolink} from 'autolink.js';
import {fieldTypes, extractTypeForIssue,
- fieldValuesToMap} from 'shared/issue-fields.js';
+ fieldValuesToMap, migratedTypes} from 'shared/issue-fields.js';
import {removePrefix, objectToMap} from 'shared/helpers.js';
import {issueRefToString, issueToIssueRefString,
issueStringToRef, issueNameToRefString} from 'shared/convertersV0.js';
@@ -493,7 +493,9 @@
const RESTRICT_VIEW_PREFIX = 'restrict-view-';
const RESTRICT_EDIT_PREFIX = 'restrict-editissue-';
const RESTRICT_COMMENT_PREFIX = 'restrict-addissuecomment-';
-const MIGRATED_ISSUE_PREFIX = 'migrated-to-b-';
+const MIGRATED_ISSUE_PREFIX = 'migrated-to-';
+const MIGRATED_BUGANIZER_ISSUE_PREFIX = 'migrated-to-b-';
+const MIGRATED_LAUNCH_ISSUE_PREFIX = 'migrated-to-launch-';
/**
* Selector to retrieve all normalized Issue data in the Redux store,
@@ -704,22 +706,49 @@
},
);
-// Gets the Issue Tracker ID of a moved issue.
+// Gets the Issue Tracker or Launch ID of a moved issue.
export const migratedId = createSelector(
labelRefs,
(labelRefs) => {
if (!labelRefs) return '';
- // Assume that there's only one migrated-to-b-* label. Or at least drop any
+ // Assume that there's only one migrated-to-* label. Or at least drop any
+ // labels besides the first one.
+ const migrationLabel = labelRefs.find((labelRef) => {
+ return labelRef.label.toLowerCase().startsWith(MIGRATED_ISSUE_PREFIX);
+ });
+
+ if (migrationLabel) {
+ if (migrationLabel.label.toLowerCase().startsWith(MIGRATED_BUGANIZER_ISSUE_PREFIX)) {
+ return migrationLabel.label.substring(MIGRATED_BUGANIZER_ISSUE_PREFIX.length);
+ } else if (migrationLabel.label.toLowerCase().startsWith(MIGRATED_LAUNCH_ISSUE_PREFIX)) {
+ return migrationLabel.label.substring(MIGRATED_LAUNCH_ISSUE_PREFIX.length);
+ }
+ }
+ return '';
+ },
+);
+
+// Gets the migration type (Buganizer or Launch) of a moved issue.
+export const migratedType = createSelector(
+ labelRefs,
+ (labelRefs) => {
+ if (!labelRefs) return migratedTypes.NONE;
+
+ // Assume that there's only one migrated-to-* label. Or at least drop any
// labels besides the first one.
const migrationLabel = labelRefs.find((labelRef) => {
return labelRef.label.toLowerCase().startsWith(MIGRATED_ISSUE_PREFIX);
});
if (migrationLabel) {
- return migrationLabel.label.substring(MIGRATED_ISSUE_PREFIX.length);
+ if (migrationLabel.label.toLowerCase().startsWith(MIGRATED_BUGANIZER_ISSUE_PREFIX)) {
+ return migratedTypes.BUGANIZER_TYPE;
+ } else if (migrationLabel.label.toLowerCase().startsWith(MIGRATED_LAUNCH_ISSUE_PREFIX)) {
+ return migratedTypes.LAUNCH_TYPE;
+ }
}
- return '';
+ return migratedTypes.NONE;
},
);
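The two selectors above share the same label scan and differ only in what they return. As a condensed, framework-free sketch of the classification they perform (the helper name is hypothetical, not part of the change):

// Illustrative sketch only: classify a single label the way migratedId/migratedType do.
import {migratedTypes} from 'shared/issue-fields.js';

const BUGANIZER_PREFIX = 'migrated-to-b-';
const LAUNCH_PREFIX = 'migrated-to-launch-';

function classifyMigrationLabel(label) {
  const lower = label.toLowerCase();
  if (lower.startsWith(BUGANIZER_PREFIX)) {
    return {id: label.substring(BUGANIZER_PREFIX.length), type: migratedTypes.BUGANIZER_TYPE};
  }
  if (lower.startsWith(LAUNCH_PREFIX)) {
    return {id: label.substring(LAUNCH_PREFIX.length), type: migratedTypes.LAUNCH_TYPE};
  }
  return {id: '', type: migratedTypes.NONE};
}

// classifyMigrationLabel('migrated-to-b-1234')      -> {id: '1234', type: 'BUGANIZER_TYPE'}
// classifyMigrationLabel('migrated-to-launch-6789') -> {id: '6789', type: 'LAUNCH_TYPE'}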
diff --git a/static_src/reducers/issueV0.test.js b/static_src/reducers/issueV0.test.js
index 33b63c1..b79cdb5 100644
--- a/static_src/reducers/issueV0.test.js
+++ b/static_src/reducers/issueV0.test.js
@@ -12,6 +12,7 @@
import {issueToIssueRef, issueRefToString} from 'shared/convertersV0.js';
import {prpcClient} from 'prpc-client-instance.js';
import {getSigninInstance} from 'shared/gapi-loader.js';
+import {migratedTypes} from 'shared/issue-fields.js';
let prpcCall;
let dispatch;
@@ -323,6 +324,48 @@
{label: 'migrated-to-b-1234'},
{label: 'migrated-to-b-6789'},
]})), '1234');
+
+ assert.equal(issueV0.migratedId(wrapIssue({labelRefs: [
+ {label: 'IgnoreThis'},
+ {label: 'IgnoreThis2'},
+ {label: 'migrated-to-launch-6789'},
+ ]})), '6789');
+
+ assert.equal(issueV0.migratedId(wrapIssue({labelRefs: [
+ {label: 'migrated-to-launch-1234'},
+ ]})), '1234');
+
+ // We assume there's only one migrated-to-* label.
+ assert.equal(issueV0.migratedId(wrapIssue({labelRefs: [
+ {label: 'migrated-to-launch-1234'},
+ {label: 'migrated-to-b-6789'},
+ ]})), '1234');
+ });
+
+ it('migratedType', () => {
+ assert.equal(issueV0.migratedType(wrapIssue()), migratedTypes.NONE);
+ assert.equal(issueV0.migratedType(wrapIssue({labelRefs: []})), migratedTypes.NONE);
+
+ assert.equal(issueV0.migratedType(wrapIssue({labelRefs: [
+ {label: 'IgnoreThis'},
+ {label: 'IgnoreThis2'},
+ ]})), migratedTypes.NONE);
+
+ assert.equal(issueV0.migratedType(wrapIssue({labelRefs: [
+ {label: 'IgnoreThis'},
+ {label: 'IgnoreThis2'},
+ {label: 'migrated-to-b-6789'},
+ ]})), migratedTypes.BUGANIZER_TYPE);
+
+ assert.equal(issueV0.migratedType(wrapIssue({labelRefs: [
+ {label: 'migrated-to-launch-1234'},
+ ]})), migratedTypes.LAUNCH_TYPE);
+
+ // We assume there's only one migrated-to-* label.
+ assert.equal(issueV0.migratedType(wrapIssue({labelRefs: [
+ {label: 'migrated-to-launch-1234'},
+ {label: 'migrated-to-b-6789'},
+ ]})), migratedTypes.LAUNCH_TYPE);
});
diff --git a/static_src/shared/issue-fields.js b/static_src/shared/issue-fields.js
index 09ac7d3..0acbe60 100644
--- a/static_src/shared/issue-fields.js
+++ b/static_src/shared/issue-fields.js
@@ -37,6 +37,13 @@
PROJECT_TYPE: 'PROJECT_TYPE',
});
+/** @enum {string} */
+export const migratedTypes = Object.freeze({
+ BUGANIZER_TYPE: 'BUGANIZER_TYPE',
+ LAUNCH_TYPE: 'LAUNCH_TYPE',
+ NONE: 'NONE',
+});
+
const GROUPABLE_FIELD_TYPES = new Set([
fieldTypes.DATE_TYPE,
fieldTypes.ENUM_TYPE,
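Because the migratedTypes values are plain strings with NONE as the "not migrated" sentinel, callers can branch on them exhaustively. A minimal sketch of that pattern (the helper below is hypothetical and only illustrates use of the enum):

// Illustrative sketch only: switching over the string enum, NONE meaning "not migrated".
import {migratedTypes} from 'shared/issue-fields.js';

function describeMigration(type) {
  switch (type) {
    case migratedTypes.BUGANIZER_TYPE: return 'moved to Issue Tracker';
    case migratedTypes.LAUNCH_TYPE: return 'migrated to Launch';
    default: return '';  // migratedTypes.NONE or unset
  }
}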
diff --git a/static_src/shared/md-helper.js b/static_src/shared/md-helper.js
index fdceebc..8a22b0d 100644
--- a/static_src/shared/md-helper.js
+++ b/static_src/shared/md-helper.js
@@ -38,7 +38,7 @@
const SANITIZE_OPTIONS = Object.freeze({
RETURN_TRUSTED_TYPE: true,
FORBID_TAGS: ['style'],
- FORBID_ATTR: ['style', 'autoplay'],
+ FORBID_ATTR: ['style', 'autoplay', 'src'],
});
/**
@@ -50,31 +50,6 @@
return raw.replace(/<b>|<\/b>/g, '**');
};
-/** @const {Object} Basic HTML character escape mapping */
-const HTML_ESCAPE_MAP = Object.freeze({
- '&': '&amp;',
- '<': '&lt;',
- '>': '&gt;',
- '"': '&quot;',
- '\'': '&#39;',
- '/': '&#x2F;',
- '`': '&#x60;',
- '=': '&#x3D;',
-});
-
-/**
- * Escapes HTML characters, used to render HTML blocks in Markdown. This
- * alleviates security flaws but is not the primary security barrier, that is
- * handled by DOMPurify.
- * @param {string} text Content that looks to Marked parser to contain HTML.
- * @return {string} Same text content after escaping HTML characters.
- */
-const escapeHtml = (text) => {
- return text.replace(/[<>"']/g, (s) => {
- return HTML_ESCAPE_MAP[s];
- });
-};
-
/**
* Checks to see if input string is a valid HTTP link.
* @param {string} string
@@ -139,8 +114,7 @@
// autolinking.
// TODO(crbug.com/monorail/9310): Integrate autolink
const preprocessed = replaceBoldTag(raw);
- const escaped = escapeHtml(preprocessed);
- const converted = marked(escaped);
+ const converted = marked(preprocessed);
const sanitized = DOMPurify.sanitize(converted, SANITIZE_OPTIONS);
return sanitized.toString();
};
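With escapeHtml removed, DOMPurify is now the single sanitization step after Markdown conversion. A minimal standalone sketch of the resulting pipeline (function name assumed; uses the public marked and DOMPurify APIs shown in the hunk above):

// Illustrative sketch only: Markdown -> HTML via marked, then DOMPurify as the sole sanitizer.
import DOMPurify from 'dompurify';
import marked from 'marked';

const SANITIZE_OPTIONS = Object.freeze({
  FORBID_TAGS: ['style'],
  FORBID_ATTR: ['style', 'autoplay', 'src'],
});

function renderUntrustedMarkdown(raw) {
  const converted = marked(raw);  // may contain raw, attacker-supplied HTML
  return DOMPurify.sanitize(converted, SANITIZE_OPTIONS);  // strips forbidden tags/attrs, e.g. src
}

// renderUntrustedMarkdown('<video id="foo" src="//youtube" control></video>')
//   -> '<p><video id="foo"></video></p>\n'   (per the updated test below)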
diff --git a/static_src/shared/md-helper.test.js b/static_src/shared/md-helper.test.js
index 6056849..52ba279 100644
--- a/static_src/shared/md-helper.test.js
+++ b/static_src/shared/md-helper.test.js
@@ -104,22 +104,8 @@
assert.equal(actual, expected);
});
- it('escapes HTML content', () => {
- let actual = renderMarkdown('<input></input>');
- assert.equal(actual, '<p>&lt;input&gt;&lt;/input&gt;</p>\n');
-
- actual = renderMarkdown('<a href="https://google.com">clickme</a>');
- assert.equal(actual,
- `<p>&lt;a href=&quot;<span class="annotated-link"><a title="" ` +
- `href="https://google.com&quot;&gt;clickme&lt;/a"><span ` +
- `class="material-icons link_off">link_off</span>` +
- `https://google.com&quot;&gt;clickme&lt;/a</a><span ` +
- `class="tooltip">Link may be malformed: ` +
- `https://google.com&quot;&gt;clickme&lt;/a</span></span>&gt;</p>\n`);
- });
-
- it('escapes video content', () => {
- const actual = renderMarkdown('<video src="//youtube" control></video>');
- assert.equal(actual, '<p>&lt;video src=&quot;//youtube&quot; control&gt;&lt;/video&gt;</p>\n');
+ it('forbids src', () => {
+ const actual = renderMarkdown('<video id="foo" src="//youtube" control></video>');
+ assert.equal(actual, '<p><video id="foo"></video></p>\n');
});
});
diff --git a/testing/fake.py b/testing/fake.py
index 33d5ed0..f484b12 100644
--- a/testing/fake.py
+++ b/testing/fake.py
@@ -2864,6 +2864,10 @@
"""Return all values, assume that the value at key is already a list."""
return self.dictionary.get(key, [])
+ def getlist(self, key):
+ """Return all values, assume that the value at key is already a list."""
+ return self.dictionary.get(key, [])
+
def get(self, key, default=None):
"""Return first value, assume that the value at key is already a list."""
return self.dictionary.get(key, [default])[0]
diff --git a/tracker/fltconversion.py b/tracker/fltconversion.py
index e42b432..63a7e63 100644
--- a/tracker/fltconversion.py
+++ b/tracker/fltconversion.py
@@ -134,8 +134,7 @@
'phase_map, approvals_to_labels, labels_re')
-# TODO: change to FlaskInternalTask when convert to flask
-class FLTConvertTask(jsonfeed.InternalTask):
+class FLTConvertTask(jsonfeed.FlaskInternalTask):
"""FLTConvert converts current Type=Launch issues into Type=FLT-Launch."""
def AssertBasePermission(self, mr):
@@ -531,11 +530,8 @@
return tracker_bizobj.MakeFieldValue(
field_id, None, None, user_id, None, None, False)
- # def GetFLTConvertTask(self, **kwargs):
- # return self.handler(**kwargs)
-
- # def PostFLTConvertTask(self, **kwargs):
- # return self.handler(**kwargs)
+ def PostFLTConvertTask(self, **kwargs):
+ return self.handler(**kwargs)
def ConvertMLabels(
diff --git a/tracker/test/componentdetail_test.py b/tracker/test/componentdetail_test.py
index 18886bc..6186582 100644
--- a/tracker/test/componentdetail_test.py
+++ b/tracker/test/componentdetail_test.py
@@ -12,7 +12,10 @@
from mock import Mock, patch
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from features import filterrules_helpers
from framework import permissions
diff --git a/tracker/test/fieldcreate_test.py b/tracker/test/fieldcreate_test.py
index 580d095..4a8c919 100644
--- a/tracker/test/fieldcreate_test.py
+++ b/tracker/test/fieldcreate_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import mock
import unittest
import logging
diff --git a/tracker/test/fielddetail_test.py b/tracker/test/fielddetail_test.py
index f9f27b4..5b31fa3 100644
--- a/tracker/test/fielddetail_test.py
+++ b/tracker/test/fielddetail_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
import logging
diff --git a/tracker/test/fltconversion_test.py b/tracker/test/fltconversion_test.py
index e0bae41..be66ad2 100644
--- a/tracker/test/fltconversion_test.py
+++ b/tracker/test/fltconversion_test.py
@@ -33,8 +33,7 @@
config=fake.ConfigService(),
template=mock.Mock(spec=template_svc.TemplateService),)
self.mr = testing_helpers.MakeMonorailRequest()
- self.task = fltconversion.FLTConvertTask(
- 'req', 'res', services=self.services)
+ self.task = fltconversion.FLTConvertTask(services=self.services)
self.task.mr = self.mr
self.issue = fake.MakeTestIssue(
789, 1, 'summary', 'New', 111, issue_id=78901)
diff --git a/tracker/test/issueadmin_test.py b/tracker/test/issueadmin_test.py
index 751e414..c36a495 100644
--- a/tracker/test/issueadmin_test.py
+++ b/tracker/test/issueadmin_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
from mock import Mock, patch
diff --git a/tracker/test/issueattachment_test.py b/tracker/test/issueattachment_test.py
index 9fccba5..f782f22 100644
--- a/tracker/test/issueattachment_test.py
+++ b/tracker/test/issueattachment_test.py
@@ -12,7 +12,10 @@
from google.appengine.ext import testbed
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import webapp2
from framework import gcs_helpers
diff --git a/tracker/test/issuedetailezt_test.py b/tracker/test/issuedetailezt_test.py
index d3b8327..fe3d22f 100644
--- a/tracker/test/issuedetailezt_test.py
+++ b/tracker/test/issuedetailezt_test.py
@@ -10,7 +10,10 @@
import logging
import mock
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import time
import unittest
diff --git a/tracker/test/issueentry_test.py b/tracker/test/issueentry_test.py
index 4a64d7c..b7461ae 100644
--- a/tracker/test/issueentry_test.py
+++ b/tracker/test/issueentry_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import time
import unittest
diff --git a/tracker/test/issuereindex_test.py b/tracker/test/issuereindex_test.py
index fe033b8..715da9a 100644
--- a/tracker/test/issuereindex_test.py
+++ b/tracker/test/issuereindex_test.py
@@ -10,7 +10,10 @@
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import settings
from framework import permissions
diff --git a/tracker/test/templatecreate_test.py b/tracker/test/templatecreate_test.py
index 60db78b..78664c0 100644
--- a/tracker/test/templatecreate_test.py
+++ b/tracker/test/templatecreate_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import unittest
import settings
diff --git a/tracker/test/templatedetail_test.py b/tracker/test/templatedetail_test.py
index 607996a..42fc46b 100644
--- a/tracker/test/templatedetail_test.py
+++ b/tracker/test/templatedetail_test.py
@@ -8,7 +8,10 @@
from __future__ import division
from __future__ import absolute_import
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
import logging
import unittest
import settings
diff --git a/tracker/test/tracker_views_test.py b/tracker/test/tracker_views_test.py
index 797b079..ddc2a3e 100644
--- a/tracker/test/tracker_views_test.py
+++ b/tracker/test/tracker_views_test.py
@@ -11,7 +11,10 @@
import logging
import unittest
-import mox
+try:
+ from mox3 import mox
+except ImportError:
+ import mox
from google.appengine.api import app_identity
import ezt