Compare commits

...

97 commits

Author SHA1 Message Date
Benjamin Renard 296618a34e
config: allow customizing the default config file mode
Some checks failed
Run tests / tests (push) Failing after 1m0s
2024-04-16 10:39:55 +02:00
Benjamin Renard 28103836ac config: Add ask_value() helper to section & config objects
Some checks failed
Run tests / tests (push) Failing after 57s
2024-04-13 18:30:15 +02:00
Benjamin Renard 3cf6a2682c opening_hours: global rework to make the module more atomic and add some new helper methods
Some checks failed
Run tests / tests (push) Failing after 1m35s
2024-04-01 19:37:51 +02:00
Benjamin Renard eb87516e1a Docker: upgrade images and base them on node:16-bookworm-slim to allow using them with Forgejo Actions
All checks were successful
Run tests / tests (push) Successful in 2m15s
2024-03-15 12:01:45 +01:00
Benjamin Renard 5dbdb0ffe6 CI: add missing publish jobs dependency on build job 2024-03-15 11:34:07 +01:00
Benjamin Renard b45819428d Switch from Woodpecker CI to Forgejo Actions
All checks were successful
Run tests / tests (push) Successful in 3m44s
2024-03-15 10:46:33 +01:00
Benjamin Renard 85caf81ac2 Introduce some new pre-commit hooks 2024-03-15 10:23:21 +01:00
Benjamin Renard 09c422efe2
Fix including test email template
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2024-03-01 16:34:45 +01:00
Benjamin Renard e368521a96
config: add _set_option() method to ConfigurableObject 2024-01-03 15:12:52 +01:00
Benjamin Renard 25cdf9d4dc
config: Add logging sections in __init__() to allow setting their default values 2023-12-19 18:36:09 +01:00
Benjamin Renard 4962b16099
config: fix configure() method to validate configuration only if -V/--validate parameter is provided 2023-12-15 13:41:53 +01:00
Benjamin Renard 371d194728
PgDB: fix doSelect() method to retrieve a list of dicts instead of a list of lists. 2023-12-15 12:12:48 +01:00
Benjamin Renard dcaec24ea4
PgDB / MyDB / OracleDB: add limit parameter to select() method 2023-12-15 11:35:43 +01:00
Benjamin Renard 2736fc30ae
report: add add_logging_handler & send_at_exit parameters 2023-12-14 21:41:16 +01:00
Benjamin Renard 73795d27b8
config: add default_config_filename parameter 2023-12-14 21:25:00 +01:00
Benjamin Renard 07ab4490d2
config: Add OctalOption 2023-12-14 17:24:59 +01:00
Benjamin Renard 68c2103c58
config: add console log_level parameter
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-10-30 14:04:43 +01:00
Benjamin Renard 0064fa979c
config: fix python 3.9 compatibility
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-10-27 13:43:55 +02:00
Benjamin Renard b92a814577
config: Add logfile feature
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2023-10-27 13:36:32 +02:00
Benjamin Renard 8a0a65465d
Update pre-commit config and fix some pylint & bandit warnings 2023-10-27 13:35:59 +02:00
Benjamin Renard 8e0e75f30e
setup.py: use README.md file as long_description (required by latest stdeb lib) 2023-07-10 16:04:56 +02:00
Benjamin Renard 14d82fe796
build.sh: exclude pre-commit commits when computing the Debian changelog 2023-07-10 14:19:41 +02:00
Benjamin Renard 698fd52a03
build.sh: add some commit exclusion regexes when computing the Debian changelog 2023-07-10 13:07:34 +02:00
Benjamin Renard 71a49f7b2f
build.sh: fix python package version 2023-07-10 13:06:01 +02:00
Benjamin Renard e38e5b10a7
build.sh: fix dpkg-source / quilt 'version does not contain a revision' error 2023-07-10 12:47:33 +02:00
Benjamin Renard 3a443e1fa5
build.sh: fix handling git safe protection 2023-07-10 12:46:44 +02:00
Benjamin Renard 44bd9a6446
Telltale: add check_entrypoint() to easily implement Icinga checker script 2023-07-10 11:56:03 +02:00
Benjamin Renard f8602801d7
Fix some pylint warnings 2023-07-10 11:55:09 +02:00
Benjamin Renard e8572e2eaa
pre-commit: bump to isort 5.11.5 2023-07-10 11:53:42 +02:00
Benjamin Renard f597164305
Config: add stuff to handle just-try mode in ConfigurableObject class 2023-06-19 17:07:59 +02:00
Benjamin Renard 72877dd13e
LdapClient.get_changes(): properly handle attributes with empty value 2023-05-03 11:35:30 +02:00
Benjamin Renard ebd73812bc
ldap.LdapServer: add format_modify_modlist() and factorize format_changes() & update_need() 2023-05-03 11:14:18 +02:00
Benjamin Renard 5693cf8f8a
Re-order pre-commit hooks to run those that change the code first 2023-03-23 09:59:05 +01:00
Benjamin Renard 63d6a6e0ed
Introduce bandit pre-commit checks 2023-03-23 09:56:31 +01:00
Benjamin Renard 73735b378f
Email: add support for CC & BCC recipients 2023-03-14 17:12:50 +01:00
Benjamin Renard c93b3508ed
Email: add possibility to specify more than one recipient 2023-03-14 16:55:36 +01:00
Benjamin Renard d75a61b4e8
Email: fix handling templates_path default value 2023-03-13 19:13:52 +01:00
Benjamin Renard 93b06d6127
ConfigurableObject: add set_default() / set_defaults() methods 2023-03-13 19:12:47 +01:00
Benjamin Renard e71fb28295
Email: add possibility to easily load templates from a directory 2023-03-13 18:58:20 +01:00
Benjamin Renard b5df95a2dd
Config: add set_default() / set_defaults() methods 2023-03-13 18:53:29 +01:00
Benjamin Renard 5aa6a0cea4
LdapClient.update_object(): add relax parameter 2023-03-13 17:02:52 +01:00
Benjamin Renard 3efaceb823 Fix tests.sh exit code
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2023-01-16 13:37:53 +01:00
Benjamin Renard 56f66dcd6e Run pytest only when at least one python file is changed 2023-01-16 13:37:17 +01:00
Benjamin Renard 86e1d59b1d Improve docker images and use brenard/mylib:dev-master to run tests quickly
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2023-01-16 13:24:19 +01:00
Benjamin Renard 62c3fadf96 Introduce pyupgrade, isort and black and configure pre-commit hooks to run all testing tools before commit
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2023-01-16 12:56:12 +01:00
Benjamin Renard a83c3d635f Add mylib.mapping.map_hash() 2023-01-16 12:23:50 +01:00
Benjamin Renard 69d6a596a8 Docker: try to make images lighter
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-10 11:22:39 +01:00
Benjamin Renard 3a43b8d07d Config.parse_arguments_options(): add hardcoded_options argument
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-10 10:52:26 +01:00
Benjamin Renard 56162452ac config: if an option value was previously set, ignore the value from options 2023-01-10 10:50:48 +01:00
Benjamin Renard 5fefc1ed84 Config: make the configparser always defined to allow setting options at any moment 2023-01-10 10:48:57 +01:00
Benjamin Renard e9477b1566 config: add optional --reconfigure parameter
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-09 13:34:07 +01:00
Benjamin Renard 508a28e5c8 tests: add some tests on BooleanOption 2023-01-09 13:33:14 +01:00
Benjamin Renard 6a7368deb5 ConfigOption: add _get_user_input method to allow mocking it in tests 2023-01-09 13:32:25 +01:00
Benjamin Renard 014a0802f8 config: make sure to reload file after saving it
To ensure the configparser object is up to date.
2023-01-09 13:27:52 +01:00
Benjamin Renard 135a742b6e config: add --console-stderr parameter
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-09 11:25:58 +01:00
Benjamin Renard da63f533be Abstract common DB methods in mysql.db.DB class and use it as base to implement PgDB, OracleDB and MyDB
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2023-01-07 02:19:18 +01:00
Benjamin Renard 83ce6b9d1b CI now runs pylint & flake8
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-06 22:18:46 +01:00
Benjamin Renard b0fd2b1c6d Code cleaning 2023-01-06 22:18:18 +01:00
Benjamin Renard 815081d544 PgDB: Fix tests
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-06 21:10:25 +01:00
Benjamin Renard 01d71fef89 Add doc about pip install method
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2023-01-06 19:38:39 +01:00
Benjamin Renard ad04357c6b Code cleaning 2023-01-06 19:38:32 +01:00
Benjamin Renard eb183b0d3b config: split console logging between stdout & stderr based on level
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-06 18:19:48 +01:00
Benjamin Renard 55782df47c config: code cleaning 2023-01-06 18:19:48 +01:00
Benjamin Renard 7efacf04e4 Add the Dockerfile to build the docker brenard/mylib:dev-master image
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2023-01-03 00:35:22 +01:00
Benjamin Renard a1118cc40a Add the Dockerfile to build the docker brenard/mylib:latest image 2023-01-03 00:21:44 +01:00
Benjamin Renard cb4b8d6974 ldap: add option to disable referral following
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-12-09 12:33:09 +01:00
Benjamin Renard c643fd30ac oracle: fix closing cursors 2022-12-09 12:20:01 +01:00
Benjamin Renard 651e1a1a6c ldap: Work around AD invalid return when searching objects
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-12-02 12:09:40 +01:00
Benjamin Renard 85d34b7b9a Config: allow access/setting config options as with a dict 2022-11-09 10:25:13 +01:00
Benjamin Renard fd2911e810 CI: fix path of deb files in publish-apt job
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-08-02 01:42:04 +02:00
Benjamin Renard cb9146f6e9 CI: fix release notes file path
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
ci/woodpecker/tag/woodpecker Pipeline failed
2022-08-02 01:37:37 +02:00
Benjamin Renard d0676443d7 CI: use brenard/aptly-publish image for publish-apt job
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
ci/woodpecker/tag/woodpecker Pipeline failed
2022-08-02 01:33:52 +02:00
Benjamin Renard 68729f301f CI: add release notes 2022-08-02 01:30:25 +02:00
Benjamin Renard 31eeff367c ldap.LdapClient.get_objects: add paged_search & pagesize parameters
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-07-20 10:52:10 +02:00
Benjamin Renard 86eae5dae5 ldap.LdapServer: make parameters accepted by search and paged_search methods identical 2022-07-20 10:50:58 +02:00
Benjamin Renard 6adcc1eeed SFTP client: add missing connect() call in get_file and open_file methods
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-06-30 14:46:43 +02:00
Benjamin Renard e858cb3d71 SFTP client: add missing docstrings
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-06-30 14:36:51 +02:00
Benjamin Renard 54d6c6e0f3 SFTP client: add get_file and open_file methods
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2022-06-30 14:34:25 +02:00
Benjamin Renard 9511b31a79 Add SFTP client
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2022-06-28 11:05:43 +02:00
Benjamin Renard b80cc3b3b6 Fix typo
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-06-23 18:56:58 +02:00
Benjamin Renard f541630a63 ldap: code cleaning / fix pylint/flake8 warnings
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2022-06-23 18:38:21 +02:00
Benjamin Renard 2bc9964b12 ldap.LdapServer: Add encode/decode helpers and parameters 2022-06-23 18:34:48 +02:00
Benjamin Renard fe3e3ed5f4 ldap: fix DN splitting/escaping problems
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2022-06-07 12:40:53 +02:00
Benjamin Renard cbb97ae726 LdapClient.update_need: fix handling None changes
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-06-01 18:46:37 +02:00
Benjamin Renard 025fd12dc4 ldap: add parameter to disable SSL certificate check 2022-06-01 18:46:08 +02:00
Benjamin Renard e8de509346 Report: add methods to attach file/payload
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-05-27 19:59:47 +02:00
Benjamin Renard 5a7a46355c LdapClient.update_object: do not modify the provided changes parameter in case of renaming 2022-05-27 19:54:03 +02:00
Benjamin Renard a36ce4070b LdapClient.add_object: ignore 'dn' attribute if provided 2022-05-27 19:52:59 +02:00
Benjamin Renard be80b1ed8c report: make Report compatible with mylib.config.Config
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-05-27 17:09:51 +02:00
Benjamin Renard 6ac1216ed8 LdapClient: improve just-try mode detection
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-05-27 15:29:16 +02:00
Benjamin Renard 5f0527f0c3 config: add get_option() method 2022-05-27 15:15:34 +02:00
Benjamin Renard ade97bc90f LdapClient: init cache in __init__ method and reset it during initialization
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
Fix cache sharing problem when multiple clients are used in the same software
2022-05-27 14:11:40 +02:00
Benjamin Renard e7e07a944a LdapClient: replace private __get_option method by protected _get_option
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
To make calls by inherited class objects easier.
2022-05-27 12:53:11 +02:00
Benjamin Renard 3bf87222fd config: start adding support of multiline values
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2022-05-13 14:47:01 +02:00
Benjamin Renard 2cdc7b870d CI: fix publishing
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
ci/woodpecker/tag/woodpecker Pipeline was successful
2022-05-01 00:01:38 +02:00
Benjamin Renard 2927065ed6 build.sh: Fix escaping gitdch calling args
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
ci/woodpecker/tag/woodpecker Pipeline failed
2022-04-30 23:44:40 +02:00
Benjamin Renard bf37e6223e Add debian publishing
Some checks failed
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline failed
Use brenard/debian-python-deb docker image to speedup pipeline
2022-04-30 22:59:19 +02:00
48 changed files with 6255 additions and 2818 deletions


@@ -0,0 +1,86 @@
---
name: Build and publish Debian & Python packages
on: ["create"]
jobs:
build:
runs-on: docker
container:
image: docker.io/brenard/debian-python-deb:latest
steps:
- name: Check out repository code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Build Debian & Python package
env:
MAINTAINER_NAME: ${{ vars.MAINTAINER_NAME }}
MAINTAINER_EMAIL: ${{ vars.MAINTAINER_EMAIL }}
DEBIAN_CODENAME: ${{ vars.DEBIAN_CODENAME }}
run: |
echo "${{ secrets.GPG_KEY }}"|base64 -d|gpg --import
./build.sh
rm -fr deb_dist/mylib-*
- name: Upload Debian & Python package files
uses: actions/upload-artifact@v3
with:
name: dist
path: |
dist
deb_dist
publish-forgejo:
runs-on: docker
container:
image: docker.io/brenard/debian-python-deb:latest
needs: build
steps:
- name: Download Debian & Python packages files
uses: actions/download-artifact@v3
with:
name: dist
- name: Create the release
id: create-release
shell: bash
run: |
mkdir release
mv dist/*.whl dist/*.tar.gz release/
mv deb_dist/*.deb release/
md5sum release/* > md5sum.txt
sha512sum release/* > sha512sum.txt
mv md5sum.txt sha512sum.txt release/
{
echo 'release_note<<EOF'
cat dist/release_notes.md
echo 'EOF'
} >> "$GITHUB_OUTPUT"
- name: Publish release on Forgejo
uses: actions/forgejo-release@v1
with:
direction: upload
url: https://gitea.zionetrix.net
token: ${{ secrets.forgejo_token }}
release-dir: release
release-notes: ${{ steps.create-release.outputs.release_note }}
publish-aptly:
runs-on: docker
container:
image: docker.io/brenard/aptly-publish:latest
needs: build
steps:
- name: "Download Debian package files"
uses: actions/download-artifact@v3
with:
name: dist
- name: "Publish Debian package on Aptly repository"
uses: https://gitea.zionetrix.net/bn8/aptly-publish@master
with:
api_url: ${{ vars.apt_api_url }}
api_username: ${{ vars.apt_api_username }}
api_password: ${{ secrets.apt_api_password }}
repo_name: ${{ vars.apt_repo_name }}
path: "deb_dist"
source_name: ${{ vars.apt_source_name }}

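The `create-release` step above writes a multiline value to `$GITHUB_OUTPUT` using the `name<<DELIMITER` heredoc syntax, since plain `name=value` output lines cannot carry newlines. A minimal standalone sketch of that trick (using a temporary file in place of the runner-provided one):

```shell
# Stand-in for the $GITHUB_OUTPUT file the Forgejo/GitHub runner provides
GITHUB_OUTPUT="$(mktemp)"

# Multiline outputs use the `name<<DELIMITER ... DELIMITER` form
{
  echo 'release_note<<EOF'
  printf 'First line of the notes\nSecond line\n'
  echo 'EOF'
} >> "$GITHUB_OUTPUT"

cat "$GITHUB_OUTPUT"
```

The runner later parses this block and exposes the value between the delimiters as `steps.create-release.outputs.release_note`.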

@@ -0,0 +1,14 @@
---
name: Run tests
on: [push]
jobs:
tests:
runs-on: docker
container:
image: docker.io/brenard/mylib:dev-master
options: "--workdir /src"
steps:
- name: Check out repository code
uses: actions/checkout@v4
- name: Run tests.sh
run: ./tests.sh --no-venv

.pre-commit-config.yaml Normal file

@@ -0,0 +1,71 @@
# Pre-commit hooks to run tests and ensure code is cleaned.
# See https://pre-commit.com for more information
---
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.1.6
hooks:
- id: ruff
args: ["--fix"]
- repo: https://github.com/asottile/pyupgrade
rev: v3.15.0
hooks:
- id: pyupgrade
args: ["--keep-percent-format", "--py37-plus"]
- repo: https://github.com/psf/black
rev: 23.11.0
hooks:
- id: black
args: ["--target-version", "py37", "--line-length", "100"]
- repo: https://github.com/PyCQA/isort
rev: 5.12.0
hooks:
- id: isort
args: ["--profile", "black", "--line-length", "100"]
- repo: https://github.com/PyCQA/flake8
rev: 6.1.0
hooks:
- id: flake8
args: ["--max-line-length=100"]
- repo: https://github.com/codespell-project/codespell
rev: v2.2.2
hooks:
- id: codespell
args:
- --ignore-words-list=exten
- --skip="./.*,*.csv,*.json,*.ini,*.subject,*.txt,*.html,*.log,*.conf"
- --quiet-level=2
- --ignore-regex=.*codespell-ignore$
# - --write-changes # Uncomment to write changes
exclude_types: [csv, json]
- repo: https://github.com/adrienverge/yamllint
rev: v1.32.0
hooks:
- id: yamllint
ignore: .github/
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v2.7.1
hooks:
- id: prettier
args: ["--print-width", "100"]
- repo: local
hooks:
- id: pylint
name: pylint
entry: ./.pre-commit-pylint --extension-pkg-whitelist=cx_Oracle
language: system
types: [python]
require_serial: true
- repo: https://github.com/PyCQA/bandit
rev: 1.7.5
hooks:
- id: bandit
args: [--skip, "B101", --recursive, "mylib"]
- repo: local
hooks:
- id: pytest
name: pytest
entry: ./.pre-commit-pytest tests
language: system
types: [python]
pass_filenames: false

.pre-commit-pylint Executable file

@@ -0,0 +1,21 @@
#!/bin/bash
PWD=$(pwd)
if [ -d "$PWD/venv" ]
then
echo "Run pylint inside venv ($PWD/venv)..."
[ ! -e "$PWD/venv/bin/pylint" ] && $PWD/venv/bin/python -m pip install pylint
$PWD/venv/bin/pylint "$@"
exit $?
elif [ -e "$PWD/pyproject.toml" ]
then
echo "Run pylint using poetry..."
poetry run pylint --version > /dev/null 2>&1 || poetry run python -m pip install pylint
poetry run pylint "$@"
exit $?
else
echo "Run pylint at system scope..."
pylint "$@"
exit $?
fi
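Both `.pre-commit-pylint` and `.pre-commit-pytest` implement the same fallback: prefer the project virtualenv, then poetry, then the system-wide tool. A generic sketch of that pattern (the `run_tool` helper is our illustration, not part of the repository):

```shell
#!/bin/bash
# Hypothetical generic version of the venv -> poetry -> system fallback
# used by the two .pre-commit-* wrapper scripts.
run_tool() {
    local tool="$1"; shift
    if [ -x "./venv/bin/$tool" ]; then
        echo "Run $tool inside venv..."
        "./venv/bin/$tool" "$@"
    elif [ -e pyproject.toml ]; then
        echo "Run $tool using poetry..."
        poetry run "$tool" "$@"
    else
        echo "Run $tool at system scope..."
        "$tool" "$@"
    fi
}

# From an empty directory there is no venv and no pyproject.toml,
# so the call falls through to the system-wide tool.
cd "$(mktemp -d)"
run_tool echo "hello"
```

Keeping the tool name as a parameter avoids duplicating the two near-identical scripts.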

.pre-commit-pytest Executable file

@@ -0,0 +1,21 @@
#!/bin/bash
PWD=$(pwd)
if [ -d "$PWD/venv" ]
then
echo "Run pytest inside venv ($PWD/venv)..."
[ ! -e "$PWD/venv/bin/pytest" ] && $PWD/venv/bin/python -m pip install pytest
$PWD/venv/bin/pytest "$@"
exit $?
elif [ -e "$PWD/pyproject.toml" ]
then
echo "Run pytest using poetry..."
poetry run pytest --version > /dev/null 2>&1 || poetry run python -m pip install pytest
poetry run pytest "$@"
exit $?
else
echo "Run pytest at system scope..."
pytest "$@"
exit $?
fi


@@ -3,4 +3,15 @@ disable=invalid-name,
locally-disabled,
too-many-arguments,
too-many-branches,
line-too-long,
too-many-locals,
too-many-return-statements,
too-many-nested-blocks,
too-many-instance-attributes,
too-many-lines,
too-many-statements,
logging-too-many-args,
duplicate-code,
[FORMAT]
# Maximum number of characters on a single line.
max-line-length=100


@@ -1,51 +0,0 @@
clone:
git:
image: woodpeckerci/plugin-git
tags: true
pipeline:
test:
image: debian
commands:
- DEBIAN_FRONTEND=noninteractive apt-get -qq update < /dev/null > /dev/null
- DEBIAN_FRONTEND=noninteractive apt-get -y -qq upgrade < /dev/null > /dev/null
- DEBIAN_FRONTEND=noninteractive apt-get -qq -y install --no-install-recommends python3-all python3-dev python3-pip python3-venv build-essential pkg-config libsystemd-dev libldap2-dev libsasl2-dev libpq-dev libmariadb-dev wget unzip < /dev/null > /dev/null
- wget --no-verbose -O /opt/instantclient-basic-linux.x64-21.4.0.0.0dbru.zip https://download.oracle.com/otn_software/linux/instantclient/214000/instantclient-basic-linux.x64-21.4.0.0.0dbru.zip
- unzip -qq -d /opt /opt/instantclient-basic-linux.x64-21.4.0.0.0dbru.zip
- echo /opt/instantclient_* > /etc/ld.so.conf.d/oracle-instantclient.conf
- ldconfig
- ./tests.sh --quiet
build:
image: debian
when:
event: tag
commands:
- DEBIAN_FRONTEND=noninteractive apt-get -qq update < /dev/null > /dev/null
- DEBIAN_FRONTEND=noninteractive apt-get -y -qq upgrade < /dev/null > /dev/null
- DEBIAN_FRONTEND=noninteractive apt-get -qq -y install --no-install-recommends git sed python3-all python3-dev python3-pip python3-venv dpkg-dev build-essential debhelper dh-python bash-completion lsb-release < /dev/null > /dev/null
- ./build.sh --quiet
- rm -fr deb_dist/mylib-*
publish-dryrun:
image: debian
when:
event: tag
commands:
- ls dist/*
- ls deb_dist/*
publish:
image: plugins/gitea-release
when:
event: tag
settings:
api_key:
from_secret: gitea_token
base_url: https://gitea.zionetrix.net
files:
- dist/*
- deb_dist/*
checksum:
- md5
- sha512


@@ -1,174 +0,0 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
#
# My hash mapping library
#
# Mapping configuration
# {
# '[dst key 1]': { # Key name in the result
#
# 'order': [int], # Processing order between destinations keys
#
# # Source values
# 'other_key': [key], # Other key of the destination to use as source of values
# 'key' : '[src key]', # Key of source hash to get source values
# 'keys' : ['[sk1]', '[sk2]', ...], # List of source hash's keys to get source values
#
# # Clean / convert values
# 'cleanRegex': '[regex]', # Regex that will be used to remove unwanted characters. Ex : [^0-9+]
# 'convert': [function], # Function to use to convert value : Original value will be passed
# as argument and the value retrieved will replace the source value in
# # the result
# # Ex :
# # lambda x: x.strip()
# # lambda x: "myformat : %s" % x
# # Deduplicate / check values
# 'deduplicate': [bool], # If True, source values will be deduplicated
# 'check': [function], # Function to use to check source value : Source value will be passed
# # as argument and if function return True, the value will be preserved
# # Ex :
# # lambda x: x in my_global_hash
# # Join values
# 'join': '[glue]', # If present, source values will be joined using the "glue"
#
# # Alternative mapping
# 'or': { [map configuration] } # If this mapping case does not retrieve any value, try to get value(s)
# # with this other mapping configuration
# },
# '[dst key 2]': {
# [...]
# }
# }
#
# Return format :
# {
# '[dst key 1]': ['v1','v2', ...],
# '[dst key 2]': [ ... ],
# [...]
# }
import logging, re
def clean_value(value):
if isinstance(value, int):
value=str(value)
return value.encode('utf8')
def map(map_keys,src,dst={}):
def get_values(dst_key,src,m):
# Extract sources values
values=[]
if 'other_key' in m:
if m['other_key'] in dst:
values=dst[m['other_key']]
if 'key' in m:
if m['key'] in src and src[m['key']]!='':
values.append(clean_value(src[m['key']]))
if 'keys' in m:
for key in m['keys']:
if key in src and src[key]!='':
values.append(clean_value(src[key]))
# Clean and convert values
if 'cleanRegex' in m and len(values)>0:
new_values=[]
for v in values:
nv=re.sub(m['cleanRegex'],'',v)
if nv!='':
new_values.append(nv)
values=new_values
if 'convert' in m and len(values)>0:
new_values=[]
for v in values:
nv=m['convert'](v)
if nv!='':
new_values.append(nv)
values=new_values
# Deduplicate values
if m.get('deduplicate') and len(values)>1:
new_values=[]
for v in values:
if v not in new_values:
new_values.append(v)
values=new_values
# Check values
if 'check' in m and len(values)>0:
new_values=[]
for v in values:
if m['check'](v):
new_values.append(v)
else:
logging.debug('Invalid value %s for key %s' % (v,dst_key))
if dst_key not in invalid_values:
invalid_values[dst_key]=[]
if v not in invalid_values[dst_key]:
invalid_values[dst_key].append(v)
values=new_values
# Join values
if 'join' in m and len(values)>1:
values=[m['join'].join(values)]
# Manage alternative mapping case
if len(values)==0 and 'or' in m:
values=get_values(dst_key,src,m['or'])
return values
for dst_key in sorted(map_keys.keys(), key=lambda x: map_keys[x]['order']):
values=get_values(dst_key,src,map_keys[dst_key])
if len(values)==0:
if 'required' in map_keys[dst_key] and map_keys[dst_key]['required']:
logging.debug('Destination key %s could not be filled from source but is required' % dst_key)
return False
continue
dst[dst_key]=values
return dst
if __name__ == '__main__':
logging.basicConfig(level=logging.DEBUG)
src={
'uid': 'hmartin',
'firstname': 'Martin',
'lastname': 'Martin',
'disp_name': 'Henri Martin',
'line_1': '3 rue de Paris',
'line_2': 'Pour Pierre',
'zip_text': '92 120',
'city_text': 'Montrouge',
'line_city': '92120 Montrouge',
'tel1': '01 00 00 00 00',
'tel2': '09 00 00 00 00',
'mobile': '06 00 00 00 00',
'fax': '01 00 00 00 00',
'email': 'H.MARTIN@GMAIL.COM',
}
map_c={
'uid': {'order': 0, 'key': 'uid','required': True},
'givenName': {'order': 1, 'key': 'firstname'},
'sn': {'order': 2, 'key': 'lastname'},
'cn': {'order': 3, 'key': 'disp_name','required': True, 'or': {'attrs': ['firstname','lastname'],'join': ' '}},
'displayName': {'order': 4, 'other_key': 'displayName'},
'street': {'order': 5, 'join': ' / ', 'keys': ['ligne_1','ligne_2']},
'postalCode': {'order': 6, 'key': 'zip_text', 'cleanRegex': '[^0-9]'},
'l': {'order': 7, 'key': 'city_text'},
'postalAddress': {'order': 8, 'join': '$', 'keys': ['ligne_1','ligne_2','ligne_city']},
'telephoneNumber': {'order': 9, 'keys': ['tel1','tel2'], 'cleanRegex': '[^0-9+]', 'deduplicate': True},
'mobile': {'order': 10,'key': 'mobile'},
'facsimileTelephoneNumber': {'order': 11,'key': 'fax'},
'mail': {'order': 12,'key': 'email', 'convert': lambda x: x.lower().strip()}
}
logging.debug('[TEST] Map src=%s / config= %s' % (src,map_c))
logging.debug('[TEST] Result : %s' % map(map_c,src))


@@ -27,21 +27,61 @@ apt install libmariadb-dev
## Installation
### Using pip
Just run `pip install git+https://gitea.zionetrix.net/bn8/python-mylib.git`
### From source
Just run `python setup.py install`
**Note:** This project could previously be used as independent python files (not as a module). This old version is kept in the _legacy_ git branch (not maintained).
## Include libs
- **mylib.email.EmailClient:** An email client to forge (optionally using templates) and send emails via an SMTP server
- **mylib.ldap.LdapServer:** A small lib to make requesting an LDAP server easier. It also provides some helper functions to deal with LDAP date strings.
- **mylib.mysql.MyDB:** An extra small lib to remind me how to interact with a MySQL/MariaDB database
- **mylib.pgsql.PgDB:** A small lib to remind me how to interact with a PostgreSQL database. **Warning:** The insert/update/delete/select methods demonstrate how to forge raw SQL requests, but **it's a bad idea**: prefer prepared queries.
- **mylib.opening_hours:** A set of helper functions to deal with French opening hours (including normal opening hours, exceptional closures and non-working public holidays).
- **mylib.pbar.Pbar:** A small lib for progress bars
- **mylib.report.Report:** A small lib to implement a logging-based email report sent at exit
To know how to use these libs, you can take a look at the _mylib.scripts_ content or the _tests_ directory.
## Code Style
[pylint](https://pypi.org/project/pylint/) is used to check for errors and enforces a coding standard, using those parameters:
```bash
pylint --extension-pkg-whitelist=cx_Oracle
```
[flake8](https://pypi.org/project/flake8/) is also used to check for errors and enforces a coding standard, using those parameters:
```bash
flake8 --max-line-length=100
```
[black](https://pypi.org/project/black/) is used to format the code, using those parameters:
```bash
black --target-version py37 --line-length 100
```
[isort](https://pypi.org/project/isort/) is used to format the imports, using those parameters:
```bash
isort --profile black --line-length 100
```
[pyupgrade](https://pypi.org/project/pyupgrade/) is used to automatically upgrade syntax, using those parameters:
```bash
pyupgrade --keep-percent-format --py37-plus
```
**Note:** A `.pre-commit-config.yaml` file is provided to run these tools automatically with [pre-commit](https://pre-commit.com/) before each commit. After cloning the repository, run `pre-commit install` to install the git hook.
## Copyright


@@ -9,11 +9,27 @@ cd $( dirname $0 )
echo "Clean previous build..."
rm -fr dist deb_dist
if [ -n "$CI" -a $UID -eq 0 ]
then
echo "CI environment detected, set current directory as git safe for root"
git config --global --add safe.directory $(pwd)
fi
echo "Detect version using git describe..."
VERSION="$( git describe --tags|sed 's/^[^0-9]*//' )"
echo "Set version=$VERSION in setup.py using sed..."
sed -i "s/^version *=.*$/version = '$VERSION'/" setup.py
echo "Computing python version..."
if [ $( echo "$VERSION"|grep -c "-" ) -gt 0 ]
then
echo "Development version detected ($VERSION), compute custom python dev version"
PY_VERSION="$( echo "$VERSION"|sed 's/-\([0-9]\)\+-.*$/.dev\1/' )"
else
echo "Clean tagged version detected, use it"
PY_VERSION="$VERSION"
fi
echo "Set version=$PY_VERSION in setup.py using sed..."
sed -i "s/^version *=.*$/version = '$PY_VERSION'/" setup.py
if [ -d venv ]
then
@@ -28,7 +44,7 @@ else
fi
echo "Install dependencies in virtualenv using pip..."
$VENV/bin/python3 -m pip install stdeb GitPython wheel $QUIET_ARG
$VENV/bin/python3 -m pip install stdeb wheel $QUIET_ARG
echo "Build wheel package..."
$VENV/bin/python3 setup.py bdist_wheel
@@ -41,7 +57,9 @@
TMP_GITDCH=$(mktemp -d)
echo "Temporary install gitdch in $TMP_GITDCH..."
git clone $QUIET_ARG https://gitea.zionetrix.net/bn8/gitdch.git $TMP_GITDCH/gitdch
GITDCH=$TMP_GITDCH/gitdch/gitdch
GITDCH="$VENV/bin/python3 $TMP_GITDCH/gitdch/gitdch"
echo "Install gitdch dependencies in $VENV..."
$VENV/bin/python3 -m pip install GitPython $QUIET_ARG
else
TMP_GITDCH=""
echo "Use existing installation of gitdch ($GITDCH)"
@@ -61,20 +79,49 @@ find deb_dist/ -maxdepth 1 -type f ! -name '*.orig.tar.gz' -delete
echo "Enter in debian package directory..."
cd deb_dist/mylib-$VERSION
echo "Retreive debian codename using lsb_release..."
DEBIAN_CODENAME=$( lsb_release -c -s )
[ $( lsb_release -r -s ) -ge 9 ] && DEBIAN_CODENAME="${DEBIAN_CODENAME}-ee"
if [ -z "$DEBIAN_CODENAME" ]
then
echo "Retrieve debian codename using lsb_release..."
DEBIAN_CODENAME=$( lsb_release -c -s )
[ $( lsb_release -r -s ) -ge 9 ] && DEBIAN_CODENAME="${DEBIAN_CODENAME}-ee"
else
echo "Use debian codename from environment ($DEBIAN_CODENAME)"
fi
# Compute debian package version
DEB_VERSION_SUFFIX="-1"
DEB_VERSION="$VERSION$DEB_VERSION_SUFFIX"
echo "Generate debian changelog using gitdch..."
GITDCH_LOG_ARG='--verbose'
[ -n "$QUIET_ARG" ] && GITDCH_LOG_ARG='--warning'
$VENV/bin/python3 $GITDCH \
GITDCH_ARGS=('--verbose')
[ -n "$QUIET_ARG" ] && GITDCH_ARGS=('--warning')
if [ -n "$MAINTAINER_NAME" ]
then
echo "Use maintainer name from environment ($MAINTAINER_NAME)"
GITDCH_ARGS+=("--maintainer-name" "${MAINTAINER_NAME}")
fi
if [ -n "$MAINTAINER_EMAIL" ]
then
echo "Use maintainer email from environment ($MAINTAINER_EMAIL)"
GITDCH_ARGS+=("--maintainer-email" "$MAINTAINER_EMAIL")
fi
$GITDCH \
--package-name mylib \
--version "${VERSION}" \
--version "${DEB_VERSION}" \
--version-suffix "${DEB_VERSION_SUFFIX}" \
--code-name $DEBIAN_CODENAME \
--output debian/changelog \
--release-notes ../../dist/release_notes.md \
--path ../../ \
$GITDCH_LOG_ARG
--exclude "^CI: " \
--exclude "^Docker: " \
--exclude "^pre-commit: " \
--exclude "\.?woodpecker(\.yml)?" \
--exclude "build(\.sh)?" \
--exclude "tests(\.sh)?" \
--exclude "README(\.md)?" \
--exclude "^Merge branch " \
"${GITDCH_ARGS[@]}"
echo "Add custom package name for dependencies..."
cat << EOF > debian/py3dist-overrides
@@ -86,4 +133,4 @@ EOF
[ -n "$TMP_GITDCH" ] && echo "Clean temporary gitdch installation..." && rm -fr $TMP_GITDCH
echo "Build debian package..."
dpkg-buildpackage --no-sign
dpkg-buildpackage
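For illustration, the tag-to-dev-version rewrite the script performs with sed can be mirrored in Python (a sketch of the intent, using a made-up tag; note the original sed's `\([0-9]\)\+` backreference keeps only the last digit of a multi-digit commit count, while `(\d+)` below keeps them all):

```python
import re

def dev_version(git_describe):
    # "1.2.3-4-gabcdef0" (4 commits after tag 1.2.3) -> "1.2.3.dev4";
    # a clean tag ("1.2.3") is returned unchanged
    return re.sub(r"-(\d+)-g[0-9a-f]+$", r".dev\1", git_describe)

print(dev_version("1.2.3-4-gabcdef0"))  # 1.2.3.dev4
print(dev_version("1.2.3"))             # 1.2.3
```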


@@ -0,0 +1,6 @@
FROM brenard/mylib:latest
RUN apt-get remove -y python3-mylib && \
git clone https://gitea.zionetrix.net/bn8/python-mylib.git /src && \
pip install --break-system-packages /src[dev] && \
cd /src && \
pre-commit run --all-files

docker/latest/Dockerfile

@@ -0,0 +1,26 @@
FROM node:16-bookworm-slim
RUN echo "deb http://debian.zionetrix.net stable main" > /etc/apt/sources.list.d/zionetrix.list && \
apt-get \
-o Acquire::AllowInsecureRepositories=true \
-o Acquire::AllowDowngradeToInsecureRepositories=true \
update && \
apt-get \
-o APT::Get::AllowUnauthenticated=true \
install --yes zionetrix-archive-keyring && \
apt-get update && \
apt-get upgrade -y && \
apt-get install -y \
python3-all python3-dev python3-pip python3-venv python3-mylib build-essential git \
libldap2-dev libsasl2-dev \
pkg-config libsystemd-dev \
libpq-dev libmariadb-dev \
wget unzip && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*
RUN python3 -m pip install --break-system-packages pylint pytest flake8 flake8-junit-report pylint-junit junitparser pre-commit
RUN wget --no-verbose \
-O /opt/instantclient-basic-linux.x64-21.4.0.0.0dbru.zip \
https://download.oracle.com/otn_software/linux/instantclient/214000/instantclient-basic-linux.x64-21.4.0.0.0dbru.zip && \
unzip -qq -d /opt /opt/instantclient-basic-linux.x64-21.4.0.0.0dbru.zip && \
echo /opt/instantclient_* > /etc/ld.so.conf.d/oracle-instantclient.conf && \
ldconfig


@@ -1,80 +1,87 @@
# -*- coding: utf-8 -*-
""" Some really common helper functions """
#
# Pretty formating helpers
# Pretty formatting helpers
#
def increment_prefix(prefix):
return "%s " % prefix if prefix else " "
"""Increment the given prefix with two spaces"""
return f'{prefix if prefix else " "} '
def pretty_format_value(value, encoding='utf8', prefix=None):
""" Returned pretty formated value to display """
def pretty_format_value(value, encoding="utf8", prefix=None):
"""Return the pretty-formatted value to display"""
if isinstance(value, dict):
return pretty_format_dict(value, encoding=encoding, prefix=prefix)
if isinstance(value, list):
return pretty_format_list(value, encoding=encoding, prefix=prefix)
if isinstance(value, bytes):
return "'%s'" % value.decode(encoding, errors='replace')
return f"'{value.decode(encoding, errors='replace')}'"
if isinstance(value, str):
return "'%s'" % value
return f"'{value}'"
if value is None:
return "None"
return "%s (%s)" % (str(value), type(value))
return f"{value} ({type(value)})"
def pretty_format_value_in_list(value, encoding='utf8', prefix=None):
def pretty_format_value_in_list(value, encoding="utf8", prefix=None):
"""
Returned pretty formated value to display in list
Return the pretty-formatted value to display in a list
That method will prefix value with line return and incremented prefix
if pretty formated value contains line return.
if pretty formatted value contains line return.
"""
prefix = prefix if prefix else ""
value = pretty_format_value(value, encoding, prefix)
if '\n' in value:
if "\n" in value:
inc_prefix = increment_prefix(prefix)
value = "\n" + "\n".join([
inc_prefix + line
for line in value.split('\n')
])
value = "\n" + "\n".join([inc_prefix + line for line in value.split("\n")])
return value
def pretty_format_dict(value, encoding='utf8', prefix=None):
""" Returned pretty formated dict to display """
def pretty_format_dict(value, encoding="utf8", prefix=None):
"""Return the pretty-formatted dict to display"""
prefix = prefix if prefix else ""
result = []
for key in sorted(value.keys()):
result.append(
"%s- %s : %s" % (
prefix, key,
pretty_format_value_in_list(
value[key],
encoding=encoding,
prefix=prefix
)
)
f"{prefix}- {key} : "
+ pretty_format_value_in_list(value[key], encoding=encoding, prefix=prefix)
)
return "\n".join(result)
def pretty_format_list(row, encoding='utf8', prefix=None):
""" Returned pretty formated list to display """
def pretty_format_list(row, encoding="utf8", prefix=None):
"""Return the pretty-formatted list to display"""
prefix = prefix if prefix else ""
result = []
for idx, values in enumerate(row):
result.append(
"%s- #%s : %s" % (
prefix, idx,
pretty_format_value_in_list(
values,
encoding=encoding,
prefix=prefix
)
)
f"{prefix}- #{idx} : "
+ pretty_format_value_in_list(values, encoding=encoding, prefix=prefix)
)
return "\n".join(result)
def pretty_format_timedelta(timedelta):
"""Format timedelta object"""
seconds = int(timedelta.total_seconds())
if seconds < 1:
return "less than one second"
periods = [
("year", 60 * 60 * 24 * 365),
("month", 60 * 60 * 24 * 30),
("day", 60 * 60 * 24),
("hour", 60 * 60),
("minute", 60),
("second", 1),
]
strings = []
for period_name, period_seconds in periods:
if seconds >= period_seconds:
period_value, seconds = divmod(seconds, period_seconds)
strings.append(f'{period_value} {period_name}{"s" if period_value > 1 else ""}')
return ", ".join(strings)
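The period decomposition in `pretty_format_timedelta()` can be exercised with a self-contained re-sketch of the same logic (duplicated here only so the example runs without mylib installed):

```python
import datetime

def format_timedelta(td):
    # Decompose total seconds into descending fixed-size periods,
    # mirroring pretty_format_timedelta() above
    seconds = int(td.total_seconds())
    if seconds < 1:
        return "less than one second"
    periods = [("year", 31536000), ("month", 2592000), ("day", 86400),
               ("hour", 3600), ("minute", 60), ("second", 1)]
    parts = []
    for name, length in periods:
        if seconds >= length:
            value, seconds = divmod(seconds, length)
            parts.append(f"{value} {name}{'s' if value > 1 else ''}")
    return ", ".join(parts)

print(format_timedelta(datetime.timedelta(days=400, hours=5)))
# 1 year, 1 month, 5 days, 5 hours
```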

File diff suppressed because it is too large

mylib/db.py

@@ -0,0 +1,411 @@
""" Basic SQL DB client """
import logging
log = logging.getLogger(__name__)
#
# Exceptions
#
class DBException(Exception):
"""Base exception class for all the other exceptions provided by this module."""
def __init__(self, error, *args, **kwargs):
for arg, value in kwargs.items():
setattr(self, arg, value)
super().__init__(error.format(*args, **kwargs))
class DBNotImplemented(DBException, RuntimeError):
"""
Raised when calling a method not implemented in child class
"""
def __init__(self, method, class_name):
super().__init__(
"The method {method} is not yet implemented in class {class_name}",
method=method,
class_name=class_name,
)
class DBFailToConnect(DBException, RuntimeError):
"""
Raised when an error occurred during database connection
"""
def __init__(self, uri):
super().__init__("An error occurred during database connection ({uri})", uri=uri)
class DBDuplicatedSQLParameter(DBException, KeyError):
"""
Raised when trying to set a SQL query parameter
and another parameter with the same name is already set
"""
def __init__(self, parameter_name):
super().__init__(
"Duplicated SQL parameter '{parameter_name}'", parameter_name=parameter_name
)
class DBUnsupportedWHEREClauses(DBException, TypeError):
"""
Raised when trying to execute query with unsupported
WHERE clauses provided
"""
def __init__(self, where_clauses):
super().__init__("Unsupported WHERE clauses: {where_clauses}", where_clauses=where_clauses)
class DBInvalidOrderByClause(DBException, TypeError):
"""
Raised when trying to select on table with invalid
ORDER BY clause provided
"""
def __init__(self, order_by):
super().__init__(
"Invalid ORDER BY clause: {order_by}. Must be a string or a list of two values"
" (ordering field name and direction)",
order_by=order_by,
)
class DBInvalidLimitClause(DBException, TypeError):
"""
Raised when trying to select on table with invalid
LIMIT clause provided
"""
def __init__(self, limit):
super().__init__(
"Invalid LIMIT clause: {limit}. Must be a non-zero positive integer.",
limit=limit,
)
class DB:
"""Database client"""
just_try = False
def __init__(self, just_try=False, **kwargs):
self.just_try = just_try
self._conn = None
for arg, value in kwargs.items():
setattr(self, f"_{arg}", value)
def connect(self, exit_on_error=True):
"""Connect to DB server"""
raise DBNotImplemented("connect", self.__class__.__name__)
def close(self):
"""Close connection with DB server (if opened)"""
if self._conn:
self._conn.close()
self._conn = None
@staticmethod
def _log_query(sql, params):
log.debug(
'Run SQL query "%s" %s',
sql,
"with params = {}".format( # pylint: disable=consider-using-f-string
", ".join([f"{key} = {value}" for key, value in params.items()])
if params
else "without params"
),
)
@staticmethod
def _log_query_exception(sql, params):
log.exception(
'Error during SQL query "%s" %s',
sql,
"with params = {}".format( # pylint: disable=consider-using-f-string
", ".join([f"{key} = {value}" for key, value in params.items()])
if params
else "without params"
),
)
def doSQL(self, sql, params=None):
"""
Run SQL query and commit changes (rollback on error)
:param sql: The SQL query
:param params: The SQL query's parameters as dict (optional)
:return: True on success, False otherwise
:rtype: bool
"""
raise DBNotImplemented("doSQL", self.__class__.__name__)
def doSelect(self, sql, params=None):
"""
Run SELECT SQL query and return list of selected rows as dict
:param sql: The SQL query
:param params: The SQL query's parameters as dict (optional)
:return: List of selected rows as dict on success, False otherwise
:rtype: list, bool
"""
raise DBNotImplemented("doSelect", self.__class__.__name__)
#
# SQL helpers
#
@staticmethod
def _quote_table_name(table):
"""Quote table name"""
return '"{}"'.format( # pylint: disable=consider-using-f-string
'"."'.join(table.split("."))
)
@staticmethod
def _quote_field_name(field):
"""Quote table name"""
return f'"{field}"'
@staticmethod
def format_param(param):
"""Format SQL query parameter for prepared query"""
return f"%({param})s"
@classmethod
def _combine_params(cls, params, to_add=None, **kwargs):
if to_add:
assert isinstance(to_add, dict), "to_add must be a dict or None"
params = cls._combine_params(params, **to_add)
for param, value in kwargs.items():
if param in params:
raise DBDuplicatedSQLParameter(param)
params[param] = value
return params
@classmethod
def _format_where_clauses(cls, where_clauses, params=None, where_op=None):
"""
Format WHERE clauses
:param where_clauses: The WHERE clauses. Could be:
- a raw SQL WHERE clause as string
- a tuple of two elements: a raw WHERE clause and its parameters as dict
- a dict of WHERE clauses with field name as key and WHERE clause value as value
- a list of any of previous valid WHERE clauses
:param params: Dict of other already set SQL query parameters (optional)
:param where_op: SQL operator used to combine WHERE clauses together (optional, default:
AND)
:return: A tuple of two elements: raw SQL WHERE combined clauses and parameters on success
:rtype: string, bool
"""
if params is None:
params = {}
if where_op is None:
where_op = "AND"
if isinstance(where_clauses, str):
return (where_clauses, params)
if (
isinstance(where_clauses, tuple)
and len(where_clauses) == 2
and isinstance(where_clauses[1], dict)
):
cls._combine_params(params, where_clauses[1])
return (where_clauses[0], params)
if isinstance(where_clauses, (list, tuple)):
sql_where_clauses = []
for where_clause in where_clauses:
sql2, params = cls._format_where_clauses(
where_clause, params=params, where_op=where_op
)
sql_where_clauses.append(sql2)
return (f" {where_op} ".join(sql_where_clauses), params)
if isinstance(where_clauses, dict):
sql_where_clauses = []
for field, value in where_clauses.items():
param = field
if field in params:
idx = 1
while param in params:
param = f"{field}_{idx}"
idx += 1
cls._combine_params(params, {param: value})
sql_where_clauses.append(
f"{cls._quote_field_name(field)} = {cls.format_param(param)}"
)
return (f" {where_op} ".join(sql_where_clauses), params)
raise DBUnsupportedWHEREClauses(where_clauses)
@classmethod
def _add_where_clauses(cls, sql, params, where_clauses, where_op=None):
"""
Add WHERE clauses to an SQL query
:param sql: The SQL query to complete
:param params: The dict of parameters of the SQL query to complete
:param where_clauses: The WHERE clause (see _format_where_clauses())
:param where_op: SQL operator used to combine WHERE clauses together (optional, default:
see _format_where_clauses())
:return:
:rtype: A tuple of two elements: raw SQL WHERE combined clauses and parameters
"""
if where_clauses:
sql_where, params = cls._format_where_clauses(
where_clauses, params=params, where_op=where_op
)
sql += " WHERE " + sql_where
return (sql, params)
def insert(self, table, values, just_try=False):
"""Run INSERT SQL query"""
# pylint: disable=consider-using-f-string
sql = "INSERT INTO {} ({}) VALUES ({})".format( # nosec
self._quote_table_name(table),
", ".join([self._quote_field_name(field) for field in values.keys()]),
", ".join([self.format_param(key) for key in values]),
)
if just_try:
log.debug("Just-try mode: execute INSERT query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=values):
log.error("Fail to execute INSERT query (SQL: %s)", sql)
return False
return True
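The query built by `insert()` for a values dict can be sketched standalone (illustrative table and values; schema-qualified table-name splitting from `_quote_table_name()` is omitted):

```python
# Rebuild the INSERT statement the way insert() does, with the class's
# double-quoted field names and %(name)s prepared-statement placeholders
table = "users"
values = {"name": "Alice", "age": 30}
sql = 'INSERT INTO "{}" ({}) VALUES ({})'.format(
    table,
    ", ".join(f'"{field}"' for field in values),
    ", ".join(f"%({field})s" for field in values),
)
print(sql)  # INSERT INTO "users" ("name", "age") VALUES (%(name)s, %(age)s)
```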
def update(self, table, values, where_clauses, where_op=None, just_try=False):
"""Run UPDATE SQL query"""
# pylint: disable=consider-using-f-string
sql = "UPDATE {} SET {}".format( # nosec
self._quote_table_name(table),
", ".join(
[f"{self._quote_field_name(key)} = {self.format_param(key)}" for key in values]
),
)
params = values
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (DBDuplicatedSQLParameter, DBUnsupportedWHEREClauses):
log.error("Fail to add WHERE clauses", exc_info=True)
return False
if just_try:
log.debug("Just-try mode: execute UPDATE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=params):
log.error("Fail to execute UPDATE query (SQL: %s)", sql)
return False
return True
def delete(self, table, where_clauses, where_op="AND", just_try=False):
"""Run DELETE SQL query"""
sql = f"DELETE FROM {self._quote_table_name(table)}" # nosec
params = {}
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (DBDuplicatedSQLParameter, DBUnsupportedWHEREClauses):
log.error("Fail to add WHERE clauses", exc_info=True)
return False
if just_try:
log.debug("Just-try mode: execute DELETE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=params):
log.error("Fail to execute DELETE query (SQL: %s)", sql)
return False
return True
def truncate(self, table, just_try=False):
"""Run TRUNCATE SQL query"""
sql = f"TRUNCATE TABLE {self._quote_table_name(table)}" # nosec
if just_try:
log.debug("Just-try mode: execute TRUNCATE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql):
log.error("Fail to execute TRUNCATE query (SQL: %s)", sql)
return False
return True
def select(
self,
table,
where_clauses=None,
fields=None,
where_op="AND",
order_by=None,
limit=None,
just_try=False,
):
"""Run SELECT SQL query"""
sql = "SELECT "
if fields is None:
sql += "*"
elif isinstance(fields, str):
sql += f"{self._quote_field_name(fields)}"
else:
sql += ", ".join([self._quote_field_name(field) for field in fields])
sql += f" FROM {self._quote_table_name(table)}"
params = {}
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (DBDuplicatedSQLParameter, DBUnsupportedWHEREClauses):
log.error("Fail to add WHERE clauses", exc_info=True)
return False
if order_by:
if isinstance(order_by, str):
sql += f" ORDER BY {order_by}"
elif (
isinstance(order_by, (list, tuple))
and len(order_by) == 2
and isinstance(order_by[0], str)
and isinstance(order_by[1], str)
and order_by[1].upper() in ("ASC", "DESC")
):
sql += f' ORDER BY "{order_by[0]}" {order_by[1].upper()}'
else:
raise DBInvalidOrderByClause(order_by)
if limit:
if not isinstance(limit, int):
try:
limit = int(limit)
except ValueError as err:
raise DBInvalidLimitClause(limit) from err
if limit <= 0:
raise DBInvalidLimitClause(limit)
sql += f" LIMIT {limit}"
if just_try:
log.debug("Just-try mode: execute SELECT query: %s", sql)
return just_try
return self.doSelect(sql, params=params)
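The WHERE-clause forms accepted by `_format_where_clauses()` can be sketched standalone (a simplified re-sketch, not the mylib code itself: duplicate-parameter errors and parameter renaming are omitted, and placeholders use the `%(name)s` style of `format_param()`):

```python
def format_where(clauses, params=None, op="AND"):
    # Accepted forms: raw SQL string, (sql, params) tuple,
    # dict of field -> value, or a list mixing any of these
    params = {} if params is None else params
    if isinstance(clauses, str):
        return clauses, params
    if isinstance(clauses, tuple) and len(clauses) == 2 and isinstance(clauses[1], dict):
        params.update(clauses[1])
        return clauses[0], params
    if isinstance(clauses, (list, tuple)):
        parts = []
        for clause in clauses:
            sql, params = format_where(clause, params, op)
            parts.append(sql)
        return f" {op} ".join(parts), params
    if isinstance(clauses, dict):
        parts = []
        for field, value in clauses.items():
            params[field] = value
            parts.append(f'"{field}" = %({field})s')
        return f" {op} ".join(parts), params
    raise TypeError(f"Unsupported WHERE clauses: {clauses!r}")

sql, params = format_where({"status": "active", "age": 42})
print(sql)  # "status" = %(status)s AND "age" = %(age)s
```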


@@ -1,114 +1,197 @@
# -*- coding: utf-8 -*-
""" Email client to forge and send emails """
import email.utils
import logging
import os
import smtplib
import email.utils
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from email.mime.base import MIMEBase
from email.encoders import encode_base64
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from mako.template import Template as MakoTemplate
from mylib.config import ConfigurableObject
from mylib.config import BooleanOption
from mylib.config import IntegerOption
from mylib.config import PasswordOption
from mylib.config import StringOption
from mylib.config import (
BooleanOption,
ConfigurableObject,
IntegerOption,
PasswordOption,
StringOption,
)
log = logging.getLogger(__name__)
class EmailClient(ConfigurableObject): # pylint: disable=useless-object-inheritance,too-many-instance-attributes
class EmailClient(
ConfigurableObject
): # pylint: disable=useless-object-inheritance,too-many-instance-attributes
"""
Email client
This class abstract all interactions with the SMTP server.
"""
_config_name = 'email'
_config_comment = 'Email'
_config_name = "email"
_config_comment = "Email"
_defaults = {
'smtp_host': 'localhost',
'smtp_port': 25,
'smtp_ssl': False,
'smtp_tls': False,
'smtp_user': None,
'smtp_password': None,
'smtp_debug': False,
'sender_name': 'No reply',
'sender_email': 'noreply@localhost',
'encoding': 'utf-8',
'catch_all_addr': None,
'just_try': False,
"smtp_host": "localhost",
"smtp_port": 25,
"smtp_ssl": False,
"smtp_tls": False,
"smtp_user": None,
"smtp_password": None,
"smtp_debug": False,
"sender_name": "No reply",
"sender_email": "noreply@localhost",
"encoding": "utf-8",
"catch_all_addr": None,
"just_try": False,
"templates_path": None,
}
templates = dict()
templates = {}
def __init__(self, templates=None, **kwargs):
def __init__(self, templates=None, initialize=False, **kwargs):
super().__init__(**kwargs)
assert templates is None or isinstance(templates, dict)
self.templates = templates if templates else dict()
self.templates = templates if templates else {}
if initialize:
self.initialize()
def configure(self, use_smtp=True, just_try=True, ** kwargs): # pylint: disable=arguments-differ
""" Configure options on registered mylib.Config object """
section = super().configure(**kwargs)
# pylint: disable=arguments-differ,arguments-renamed
def configure(self, use_smtp=True, **kwargs):
"""Configure options on registered mylib.Config object"""
section = super().configure(
just_try_help=kwargs.pop("just_try_help", "Just-try mode: do not really send emails"),
**kwargs,
)
if use_smtp:
section.add_option(
StringOption, 'smtp_host', default=self._defaults['smtp_host'],
comment='SMTP server hostname/IP address')
StringOption,
"smtp_host",
default=self._defaults["smtp_host"],
comment="SMTP server hostname/IP address",
)
section.add_option(
IntegerOption, 'smtp_port', default=self._defaults['smtp_port'],
comment='SMTP server port')
IntegerOption,
"smtp_port",
default=self._defaults["smtp_port"],
comment="SMTP server port",
)
section.add_option(
BooleanOption, 'smtp_ssl', default=self._defaults['smtp_ssl'],
comment='Use SSL on SMTP server connection')
BooleanOption,
"smtp_ssl",
default=self._defaults["smtp_ssl"],
comment="Use SSL on SMTP server connection",
)
section.add_option(
BooleanOption, 'smtp_tls', default=self._defaults['smtp_tls'],
comment='Use TLS on SMTP server connection')
BooleanOption,
"smtp_tls",
default=self._defaults["smtp_tls"],
comment="Use TLS on SMTP server connection",
)
section.add_option(
StringOption, 'smtp_user', default=self._defaults['smtp_user'],
comment='SMTP authentication username')
StringOption,
"smtp_user",
default=self._defaults["smtp_user"],
comment="SMTP authentication username",
)
section.add_option(
PasswordOption, 'smtp_password', default=self._defaults['smtp_password'],
PasswordOption,
"smtp_password",
default=self._defaults["smtp_password"],
comment='SMTP authentication password (set to "keyring" to use XDG keyring)',
username_option='smtp_user', keyring_value='keyring')
username_option="smtp_user",
keyring_value="keyring",
)
section.add_option(
BooleanOption, 'smtp_debug', default=self._defaults['smtp_debug'],
comment='Enable SMTP debugging')
BooleanOption,
"smtp_debug",
default=self._defaults["smtp_debug"],
comment="Enable SMTP debugging",
)
section.add_option(
StringOption, 'sender_name', default=self._defaults['sender_name'],
comment='Sender name')
StringOption,
"sender_name",
default=self._defaults["sender_name"],
comment="Sender name",
)
section.add_option(
StringOption, 'sender_email', default=self._defaults['sender_email'],
comment='Sender email address')
StringOption,
"sender_email",
default=self._defaults["sender_email"],
comment="Sender email address",
)
section.add_option(
StringOption, 'encoding', default=self._defaults['encoding'],
comment='Email encoding')
StringOption, "encoding", default=self._defaults["encoding"], comment="Email encoding"
)
section.add_option(
StringOption, 'catch_all_addr', default=self._defaults['catch_all_addr'],
comment='Catch all sent emails to this specified email address')
StringOption,
"catch_all_addr",
default=self._defaults["catch_all_addr"],
comment="Catch all sent emails to this specified email address",
)
if just_try:
section.add_option(
BooleanOption, 'just_try', default=self._defaults['just_try'],
comment='Just-try mode: do not really send emails')
section.add_option(
StringOption,
"templates_path",
default=self._defaults["templates_path"],
comment="Path to templates directory",
)
return section
def forge_message(self, rcpt_to, subject=None, html_body=None, text_body=None, # pylint: disable=too-many-arguments,too-many-locals
attachment_files=None, attachment_payloads=None, sender_name=None,
sender_email=None, encoding=None, template=None, **template_vars):
def initialize(self, *args, **kwargs): # pylint: disable=arguments-differ
"""Configuration initialized hook"""
super().initialize(*args, **kwargs)
self.load_templates_directory()
def load_templates_directory(self, templates_path=None):
"""Load templates from specified directory"""
if templates_path is None:
templates_path = self._get_option("templates_path")
if not templates_path:
return
log.debug("Load email templates from %s directory", templates_path)
for filename in os.listdir(templates_path):
filepath = os.path.join(templates_path, filename)
if not os.path.isfile(filepath):
continue
template_name, template_type = os.path.splitext(filename)
if template_type not in [".html", ".txt", ".subject"]:
continue
template_type = "text" if template_type == ".txt" else template_type[1:]
if template_name not in self.templates:
self.templates[template_name] = {}
log.debug("Load email template %s %s from %s", template_name, template_type, filepath)
with open(filepath, encoding="utf8") as file_desc:
self.templates[template_name][template_type] = MakoTemplate(
file_desc.read()
) # nosec
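Loaded templates may be Mako templates (rendered with `.render()`) or plain strings rendered with `str.format()`, as `forge_message()` supports both. A minimal sketch of the resulting `templates` structure, using a hypothetical `welcome` template name and `name` variable:

```python
# Hypothetical "welcome" template, plain-string variant
# (a Mako template object could replace any of these values)
templates = {
    "welcome": {
        "subject": "Hello {name}!",
        "text": "Welcome aboard, {name}.",
    }
}
subject = templates["welcome"]["subject"].format(name="Alice")
body = templates["welcome"]["text"].format(name="Alice")
print(subject)  # Hello Alice!
```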
def forge_message(
self,
recipients,
subject=None,
html_body=None,
text_body=None, # pylint: disable=too-many-arguments,too-many-locals
attachment_files=None,
attachment_payloads=None,
sender_name=None,
sender_email=None,
encoding=None,
template=None,
cc=None,
**template_vars,
):
"""
Forge a message
:param rcpt_to: The recipient of the email. Could be a tuple(name, email) or just the email of the recipient.
:param recipients: The recipient(s) of the email. List of tuple(name, email) or
just the email of the recipients.
:param subject: The subject of the email.
:param html_body: The HTML body of the email
:param text_body: The plain text body of the email
@@ -118,269 +201,318 @@ class EmailClient(ConfigurableObject): # pylint: disable=useless-object-inherit
:param sender_email: Custom sender email (default: as defined on initialization)
:param encoding: Email content encoding (default: as defined on initialization)
:param template: The name of a template to use to forge this email
:param cc: Optional list of CC recipient addresses.
List of tuple(name, email) or just the email of the recipients.
All other parameters will be considered as template variables.
"""
msg = MIMEMultipart('alternative')
msg['To'] = email.utils.formataddr(rcpt_to) if isinstance(rcpt_to, tuple) else rcpt_to
msg['From'] = email.utils.formataddr(
recipients = [recipients] if not isinstance(recipients, list) else recipients
msg = MIMEMultipart("alternative")
msg["To"] = ", ".join(
[
email.utils.formataddr(recipient) if isinstance(recipient, tuple) else recipient
for recipient in recipients
]
)
if cc:
cc = [cc] if not isinstance(cc, list) else cc
msg["Cc"] = ", ".join(
[
email.utils.formataddr(recipient) if isinstance(recipient, tuple) else recipient
for recipient in cc
]
)
msg["From"] = email.utils.formataddr(
(
sender_name or self._get_option('sender_name'),
sender_email or self._get_option('sender_email')
sender_name or self._get_option("sender_name"),
sender_email or self._get_option("sender_email"),
)
)
if subject:
msg['Subject'] = subject.format(**template_vars)
msg['Date'] = email.utils.formatdate(None, True)
encoding = encoding if encoding else self._get_option('encoding')
msg["Subject"] = (
subject.render(**template_vars)
if isinstance(subject, MakoTemplate)
else subject.format(**template_vars)
)
msg["Date"] = email.utils.formatdate(None, True)
encoding = encoding if encoding else self._get_option("encoding")
if template:
assert template in self.templates, "Unknwon template %s" % template
assert template in self.templates, f"Unknown template {template}"
# Handle subject from template
if not subject:
assert self.templates[template].get('subject'), 'No subject defined in template %s' % template
msg['Subject'] = self.templates[template]['subject'].format(**template_vars)
assert self.templates[template].get(
"subject"
), f"No subject defined in template {template}"
msg["Subject"] = (
self.templates[template]["subject"].render(**template_vars)
if isinstance(self.templates[template]["subject"], MakoTemplate)
else self.templates[template]["subject"].format(**template_vars)
)
# Put HTML part in last one to prefered it
# Put the HTML part last so that it is preferred
parts = []
if self.templates[template].get('text'):
if isinstance(self.templates[template]['text'], MakoTemplate):
parts.append((self.templates[template]['text'].render(**template_vars), 'plain'))
if self.templates[template].get("text"):
if isinstance(self.templates[template]["text"], MakoTemplate):
parts.append(
(self.templates[template]["text"].render(**template_vars), "plain")
)
else:
parts.append((self.templates[template]['text'].format(**template_vars), 'plain'))
if self.templates[template].get('html'):
if isinstance(self.templates[template]['html'], MakoTemplate):
parts.append((self.templates[template]['html'].render(**template_vars), 'html'))
parts.append(
(self.templates[template]["text"].format(**template_vars), "plain")
)
if self.templates[template].get("html"):
if isinstance(self.templates[template]["html"], MakoTemplate):
parts.append((self.templates[template]["html"].render(**template_vars), "html"))
else:
parts.append((self.templates[template]['html'].format(**template_vars), 'html'))
parts.append((self.templates[template]["html"].format(**template_vars), "html"))
for body, mime_type in parts:
msg.attach(MIMEText(body.encode(encoding), mime_type, _charset=encoding))
else:
assert subject, 'No subject provided'
assert subject, "No subject provided"
if text_body:
msg.attach(MIMEText(text_body.encode(encoding), 'plain', _charset=encoding))
msg.attach(MIMEText(text_body.encode(encoding), "plain", _charset=encoding))
if html_body:
msg.attach(MIMEText(html_body.encode(encoding), 'html', _charset=encoding))
msg.attach(MIMEText(html_body.encode(encoding), "html", _charset=encoding))
if attachment_files:
for filepath in attachment_files:
with open(filepath, 'rb') as fp:
part = MIMEBase('application', "octet-stream")
with open(filepath, "rb") as fp:
part = MIMEBase("application", "octet-stream")
part.set_payload(fp.read())
encode_base64(part)
part.add_header('Content-Disposition', 'attachment; filename="%s"' % os.path.basename(filepath))
part.add_header(
"Content-Disposition",
f'attachment; filename="{os.path.basename(filepath)}"',
)
msg.attach(part)
if attachment_payloads:
for filename, payload in attachment_payloads:
part = MIMEBase('application', "octet-stream")
part = MIMEBase("application", "octet-stream")
part.set_payload(payload)
encode_base64(part)
part.add_header('Content-Disposition', 'attachment; filename="%s"' % filename)
part.add_header("Content-Disposition", f'attachment; filename="{filename}"')
msg.attach(part)
return msg
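The To/Cc header construction above accepts either `(name, email)` tuples or bare address strings; for example (addresses are made up):

```python
import email.utils

# Mixed recipient forms, as forge_message() allows
recipients = [("Alice", "alice@example.org"), "bob@example.org"]
to_header = ", ".join(
    email.utils.formataddr(r) if isinstance(r, tuple) else r
    for r in recipients
)
print(to_header)  # Alice <alice@example.org>, bob@example.org
```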
def send(self, rcpt_to, msg=None, subject=None, just_try=False, **forge_args):
def send(
self, recipients, msg=None, subject=None, just_try=None, cc=None, bcc=None, **forge_args
):
"""
Send an email
:param rcpt_to: The recipient of the email. Could be a tuple(name, email)
or just the email of the recipient.
:param recipients: The recipient(s) of the email. List of tuple(name, email) or
just the email of the recipients.
:param msg: The message of this email (as MIMEBase or derivated classes)
:param subject: The subject of the email (only if the message is not provided
using msg parameter)
:param just_try: Enable just try mode (do not really send email, default: as defined on initialization)
:param just_try: Enable just try mode (do not really send email, default: as defined on
initialization)
:param cc: Optional list of CC recipient addresses. List of tuple(name, email) or
just the email of the recipients.
:param bcc: Optional list of BCC recipient addresses. List of tuple(name, email) or
just the email of the recipients.
All other parameters will be considered as parameters to forge the message
(only if the message is not provided using msg parameter).
"""
msg = msg if msg else self.forge_message(rcpt_to, subject, **forge_args)
recipients = [recipients] if not isinstance(recipients, list) else recipients
msg = msg if msg else self.forge_message(recipients, subject, cc=cc, **forge_args)
catch_addr = self._get_option("catch_all_addr")
if catch_addr:
log.debug(
"Catch email originally addressed to %s (CC: %s, BCC: %s), sending to %s instead",
", ".join(recipients),
", ".join(cc) if isinstance(cc, list) else cc,
", ".join(bcc) if isinstance(bcc, list) else bcc,
catch_addr,
)
recipients = catch_addr if isinstance(catch_addr, list) else [catch_addr]
else:
if cc:
recipients.extend(
[
recipient[1] if isinstance(recipient, tuple) else recipient
for recipient in (cc if isinstance(cc, list) else [cc])
]
)
if bcc:
recipients.extend(
[
recipient[1] if isinstance(recipient, tuple) else recipient
for recipient in (bcc if isinstance(bcc, list) else [bcc])
]
)
if just_try if just_try is not None else self._just_try:
log.debug(
'Just-try mode: do not really send this email to %s (subject="%s")',
", ".join(recipients),
subject or msg.get("subject", "No subject"),
)
return True
smtp_host = self._get_option("smtp_host")
smtp_port = self._get_option("smtp_port")
try:
if self._get_option("smtp_ssl"):
logging.info("Establish SSL connection to server %s:%s", smtp_host, smtp_port)
server = smtplib.SMTP_SSL(smtp_host, smtp_port)
else:
logging.info("Establish connection to server %s:%s", smtp_host, smtp_port)
server = smtplib.SMTP(smtp_host, smtp_port)
if self._get_option("smtp_tls"):
logging.info("Start TLS on SMTP connection")
server.starttls()
except smtplib.SMTPException:
log.error("Error connecting to SMTP server %s:%s", smtp_host, smtp_port, exc_info=True)
return False
if self._get_option("smtp_debug"):
server.set_debuglevel(True)
smtp_user = self._get_option("smtp_user")
smtp_password = self._get_option("smtp_password")
if smtp_user and smtp_password:
try:
log.info("Try to authenticate on SMTP connection as %s", smtp_user)
server.login(smtp_user, smtp_password)
except smtplib.SMTPException:
log.error(
"Error authenticating on SMTP server %s:%s with user %s",
smtp_host,
smtp_port,
smtp_user,
exc_info=True,
)
return False
error = False
try:
log.info("Sending email to %s", ", ".join(recipients))
server.sendmail(
self._get_option("sender_email"),
[
recipient[1] if isinstance(recipient, tuple) else recipient
for recipient in recipients
],
msg.as_string(),
)
except smtplib.SMTPException:
error = True
log.error("Error sending email to %s", ", ".join(recipients), exc_info=True)
finally:
server.quit()
return not error
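The recipient handling above can be summarized in a standalone sketch (the `flatten_recipients` helper below is hypothetical, for illustration only): a single address or a `(name, email)` tuple, or lists of either, plus optional `cc`/`bcc`, are flattened into the bare email strings passed to `sendmail()`.

```python
def flatten_recipients(recipients, cc=None, bcc=None):
    """Flatten recipients/cc/bcc to a list of bare email addresses,
    accepting single values, (name, email) tuples, or lists of either."""
    recipients = recipients if isinstance(recipients, list) else [recipients]
    addresses = [r[1] if isinstance(r, tuple) else r for r in recipients]
    for extra in (cc, bcc):
        if extra:
            extra = extra if isinstance(extra, list) else [extra]
            addresses.extend(e[1] if isinstance(e, tuple) else e for e in extra)
    return addresses

print(flatten_recipients(("Alice", "alice@example.org"), cc="copy@example.org"))
```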
if __name__ == "__main__":
# Run tests
import argparse
import datetime
import sys
# Options parser
parser = argparse.ArgumentParser()
parser.add_argument(
"-v", "--verbose", action="store_true", dest="verbose", help="Enable verbose mode"
)
parser.add_argument(
"-d", "--debug", action="store_true", dest="debug", help="Enable debug mode"
)
parser.add_argument(
"-l", "--log-file", action="store", type=str, dest="logfile", help="Log file path"
)
parser.add_argument(
"-j", "--just-try", action="store_true", dest="just_try", help="Enable just-try mode"
)
email_opts = parser.add_argument_group("Email options")
email_opts.add_argument(
"-H", "--smtp-host", action="store", type=str, dest="email_smtp_host", help="SMTP host"
)
email_opts.add_argument(
"-P", "--smtp-port", action="store", type=int, dest="email_smtp_port", help="SMTP port"
)
email_opts.add_argument(
"-S", "--smtp-ssl", action="store_true", dest="email_smtp_ssl", help="Use SSL"
)
email_opts.add_argument(
"-T", "--smtp-tls", action="store_true", dest="email_smtp_tls", help="Use TLS"
)
email_opts.add_argument(
"-u", "--smtp-user", action="store", type=str, dest="email_smtp_user", help="SMTP username"
)
email_opts.add_argument(
"-p",
"--smtp-password",
action="store",
type=str,
dest="email_smtp_password",
help="SMTP password",
)
email_opts.add_argument(
"-D",
"--smtp-debug",
action="store_true",
dest="email_smtp_debug",
help="Debug SMTP connection",
)
email_opts.add_argument(
"-e",
"--email-encoding",
action="store",
type=str,
dest="email_encoding",
help="SMTP encoding",
)
email_opts.add_argument(
"-f",
"--sender-name",
action="store",
type=str,
dest="email_sender_name",
help="Sender name",
)
email_opts.add_argument(
"-F",
"--sender-email",
action="store",
type=str,
dest="email_sender_email",
help="Sender email",
)
email_opts.add_argument(
"-C",
"--catch-all",
action="store",
type=str,
dest="email_catch_all",
help="Catch all sent email: specify catch recipient email address",
)
test_opts = parser.add_argument_group("Test email options")
test_opts.add_argument(
"-t",
"--to",
action="store",
type=str,
dest="test_to",
)
test_opts.add_argument(
"-m",
"--mako",
action="store_true",
dest="test_mako",
help="Test mako templating",
options = parser.parse_args()
if not options.test_to:
parser.error("You must specify test email recipient using -t/--to parameter")
sys.exit(1)
# Initialize logs
logformat = "%(asctime)s - Test EmailClient - %(levelname)s - %(message)s"
if options.debug:
loglevel = logging.DEBUG
elif options.verbose:
if options.email_smtp_user and not options.email_smtp_password:
import getpass
options.email_smtp_password = getpass.getpass("Please enter SMTP password: ")
logging.info("Initialize Email client")
email_client = EmailClient(
smtp_host=options.email_smtp_host,
smtp_port=options.email_smtp_port,
catch_all_addr=options.email_catch_all,
just_try=options.just_try,
encoding=options.email_encoding,
templates={
"test": {
"subject": "Test email",
"text": (
"Just a test email sent at {sent_date}."
if not options.test_mako
else MakoTemplate("Just a test email sent at ${sent_date | h}.") # nosec
),
"html": (
"<strong>Just a test email.</strong> <small>(sent at {sent_date | h})</small>"
if not options.test_mako
else MakoTemplate( # nosec
"<strong>Just a test email.</strong> "
"<small>(sent at ${sent_date | h})</small>"
)
),
}
},
)
logging.info("Send a test email to %s", options.test_to)
if email_client.send(options.test_to, template="test", sent_date=datetime.datetime.now()):
logging.info("Test email sent")
sys.exit(0)
logging.error("Fail to send test email")
sys.exit(1)

(one large file diff suppressed)

mylib/mapping.py (new file, 138 lines)
"""
My hash mapping library
Mapping configuration
{
'[dst key 1]': { # Key name in the result
'order': [int], # Processing order between destinations keys
# Source values
'other_key': [key], # Other key of the destination to use as source of values
'key' : '[src key]', # Key of source hash to get source values
'keys' : ['[sk1]', '[sk2]', ...], # List of source hash's keys to get source values
# Clean / convert values
'cleanRegex': '[regex]', # Regex used to remove unwanted characters. Ex: [^0-9+]
'convert': [function], # Function used to convert the value: the original value is passed
# as argument and the returned value replaces the source value in
# the result
# Ex :
# lambda x: x.strip()
# lambda x: "myformat : %s" % x
# Deduplicate / check values
'deduplicate': [bool], # If True, source values will be deduplicated
'check': [function], # Function used to check each source value: the value is passed
# as argument and is kept only if the function returns True
# Ex :
# lambda x: x in my_global_hash
# Join values
'join': '[glue]', # If present, source values will be joined using the "glue"
# Alternative mapping
'or': { [map configuration] } # If this mapping case does not retrieve any value, try to
# get value(s) with this other mapping configuration
},
'[dst key 2]': {
[...]
}
}
Return format :
{
'[dst key 1]': ['v1','v2', ...],
'[dst key 2]': [ ... ],
[...]
}
"""
import logging
import re
log = logging.getLogger(__name__)
def clean_value(value):
"""Clean value as encoded string"""
if isinstance(value, int):
value = str(value)
return value
def get_values(dst, dst_key, src, m):
"""Extract sources values"""
values = []
if "other_key" in m:
if m["other_key"] in dst:
values = dst[m["other_key"]]
if "key" in m:
if m["key"] in src and src[m["key"]] != "":
values.append(clean_value(src[m["key"]]))
if "keys" in m:
for key in m["keys"]:
if key in src and src[key] != "":
values.append(clean_value(src[key]))
# Clean and convert values
if "cleanRegex" in m and len(values) > 0:
new_values = []
for v in values:
nv = re.sub(m["cleanRegex"], "", v)
if nv != "":
new_values.append(nv)
values = new_values
if "convert" in m and len(values) > 0:
new_values = []
for v in values:
nv = m["convert"](v)
if nv != "":
new_values.append(nv)
values = new_values
# Deduplicate values
if m.get("deduplicate") and len(values) > 1:
new_values = []
for v in values:
if v not in new_values:
new_values.append(v)
values = new_values
# Check values
if "check" in m and len(values) > 0:
new_values = []
for v in values:
if m["check"](v):
new_values.append(v)
else:
log.debug("Invalid value %s for key %s", v, dst_key)
values = new_values
# Join values
if "join" in m and len(values) > 1:
values = [m["join"].join(values)]
# Manage alternative mapping case
if len(values) == 0 and "or" in m:
values = get_values(dst, dst_key, src, m["or"])
return values
def map_hash(mapping, src, dst=None):
"""Map hash"""
dst = dst if dst else {}
assert isinstance(dst, dict)
for dst_key in sorted(mapping.keys(), key=lambda x: mapping[x]["order"]):
values = get_values(dst, dst_key, src, mapping[dst_key])
if len(values) == 0:
if "required" in mapping[dst_key] and mapping[dst_key]["required"]:
log.debug(
"Destination key %s could not be filled from source but is required", dst_key
)
return False
continue
dst[dst_key] = values
return dst
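A minimal, hypothetical walk-through of a single mapping case from the configuration format documented above (the `src` record and the `m` configuration are made-up sample data): take the value under `key`, strip characters matching `cleanRegex`, and, when several values remain, join them with the glue.

```python
import re

# Hypothetical source record and one destination-key mapping configuration
src = {"phone": " +33 1 23 45 67 89 "}
m = {"order": 1, "key": "phone", "cleanRegex": "[^0-9+]", "join": ", "}

values = [src[m["key"]]]                              # extract source value
values = [re.sub(m["cleanRegex"], "", v) for v in values]  # clean: keep digits and '+'
if "join" in m and len(values) > 1:                   # join multiple values
    values = [m["join"].join(values)]
print(values)
```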

# -*- coding: utf-8 -*-
""" MySQL client """
import logging
import sys
import MySQLdb
from MySQLdb._exceptions import Error
from mylib.db import DB, DBFailToConnect
log = logging.getLogger(__name__)
class MyDB(DB):
"""MySQL client"""
_host = None
_user = None
_pwd = None
_db = None
def __init__(self, host, user, pwd, db, charset=None, **kwargs):
self._host = host
self._user = user
self._pwd = pwd
self._db = db
self._charset = charset if charset else "utf8"
super().__init__(**kwargs)
def connect(self, exit_on_error=True):
"""Connect to MySQL server"""
if self._conn is None:
try:
self._conn = MySQLdb.connect(
host=self._host,
user=self._user,
passwd=self._pwd,
db=self._db,
charset=self._charset,
use_unicode=True,
)
except Error as err:
log.fatal(
"An error occurred during MySQL database connection (%s@%s:%s).",
self._user,
self._host,
self._db,
exc_info=1,
)
if exit_on_error:
sys.exit(1)
else:
raise DBFailToConnect(f"{self._user}@{self._host}:{self._db}") from err
return True
def doSQL(self, sql, params=None):
"""
Run SQL query and commit changes (rollback on error)
:param sql: The SQL query
:param params: The SQL query's parameters as dict (optional)
:return: True on success, False otherwise
:rtype: bool
"""
if self.just_try:
log.debug("Just-try mode : do not really execute SQL query '%s'", sql)
return True
cursor = self._conn.cursor()
try:
self._log_query(sql, params)
cursor.execute(sql, params)
self._conn.commit()
return True
except Error:
self._log_query_exception(sql, params)
self._conn.rollback()
return False
def doSelect(self, sql, params=None):
"""
Run SELECT SQL query and return list of selected rows as dict
:param sql: The SQL query
:param params: The SQL query's parameters as dict (optional)
:return: List of selected rows as dict on success, False otherwise
:rtype: list, bool
"""
try:
self._log_query(sql, params)
cursor = self._conn.cursor()
cursor.execute(sql, params)
return [
{field[0]: row[idx] for idx, field in enumerate(cursor.description)}
for row in cursor.fetchall()
]
except Error:
self._log_query_exception(sql, params)
return False
@staticmethod
def _quote_table_name(table):
"""Quote table name"""
return "`{}`".format( # pylint: disable=consider-using-f-string
"`.`".join(table.split("."))
)
@staticmethod
def _quote_field_name(field):
"""Quote field name"""
return f"`{field}`"
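The row conversion done in `doSelect()` can be shown in isolation: each column name from `cursor.description` is paired with the corresponding value of a row tuple. The `description` and `rows` below are stand-ins for what a MySQLdb cursor would return.

```python
# Stand-in values mimicking MySQLdb's cursor.description and fetchall()
description = (("id", None), ("name", None))
rows = [(1, "alice"), (2, "bob")]

# Same dict-comprehension as in doSelect(): column name -> row value
result = [
    {field[0]: row[idx] for idx, field in enumerate(description)}
    for row in rows
]
print(result)
```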

# -*- coding: utf-8 -*-
""" Opening hours helpers """
import datetime
import logging
import re
import time
log = logging.getLogger(__name__)
week_days = ["lundi", "mardi", "mercredi", "jeudi", "vendredi", "samedi", "dimanche"]
date_format = "%d/%m/%Y"
date_pattern = re.compile("^([0-9]{2})/([0-9]{2})/([0-9]{4})$")
time_pattern = re.compile("^([0-9]{1,2})h([0-9]{2})?$")
_nonworking_french_public_days_of_the_year_cache = {}
def easter_date(year):
"""Compute easter date for the specified year"""
a = year // 100
b = year % 100
c = (3 * (a + 25)) // 4
def nonworking_french_public_days_of_the_year(year=None):
"""Compute dict of nonworking french public days for the specified year"""
if year is None:
year = datetime.date.today().year
if year not in _nonworking_french_public_days_of_the_year_cache:
dp = easter_date(year)
_nonworking_french_public_days_of_the_year_cache[year] = {
"1janvier": datetime.date(year, 1, 1),
"paques": dp,
"lundi_paques": (dp + datetime.timedelta(1)),
"1mai": datetime.date(year, 5, 1),
"8mai": datetime.date(year, 5, 8),
"jeudi_ascension": (dp + datetime.timedelta(39)),
"pentecote": (dp + datetime.timedelta(49)),
"lundi_pentecote": (dp + datetime.timedelta(50)),
"14juillet": datetime.date(year, 7, 14),
"15aout": datetime.date(year, 8, 15),
"1novembre": datetime.date(year, 11, 1),
"11novembre": datetime.date(year, 11, 11),
"noel": datetime.date(year, 12, 25),
"saint_etienne": datetime.date(year, 12, 26),
}
return _nonworking_french_public_days_of_the_year_cache[year]
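The movable feasts in the dict above are fixed offsets from Easter, exactly as computed in `nonworking_french_public_days_of_the_year()`. Taking Easter 2024 (31 March) as a known reference:

```python
import datetime

paques = datetime.date(2024, 3, 31)                      # Easter Sunday 2024
lundi_paques = paques + datetime.timedelta(1)            # Easter Monday
jeudi_ascension = paques + datetime.timedelta(39)        # Ascension Thursday
lundi_pentecote = paques + datetime.timedelta(50)        # Whit Monday
print(lundi_paques, jeudi_ascension, lundi_pentecote)
```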
def parse_exceptional_closures(values):
"""Parse exceptional closures values"""
exceptional_closures = []
for value in values:
days = []
for word in words:
if not word:
continue
parts = word.split("-")
if len(parts) == 1:
# ex: 31/02/2017
ptime = time.strptime(word, date_format)
pstart = time.strptime(parts[0], date_format)
pstop = time.strptime(parts[1], date_format)
if pstop <= pstart:
raise ValueError(f"Day {parts[1]} <= {parts[0]}")
date = datetime.date(pstart.tm_year, pstart.tm_mon, pstart.tm_mday)
stop_date = datetime.date(pstop.tm_year, pstop.tm_mon, pstop.tm_mday)
mstart = time_pattern.match(parts[0])
mstop = time_pattern.match(parts[1])
if not mstart or not mstop:
raise ValueError(f'"{word}" is not a valid time period')
hstart = datetime.time(int(mstart.group(1)), int(mstart.group(2) or 0))
hstop = datetime.time(int(mstop.group(1)), int(mstop.group(2) or 0))
if hstop <= hstart:
raise ValueError(f"Time {parts[1]} <= {parts[0]}")
hours_periods.append({"start": hstart, "stop": hstop})
else:
raise ValueError(f'Invalid number of part in this word: "{word}"')
if not days:
raise ValueError(f'No days found in value "{value}"')
exceptional_closures.append({"days": days, "hours_periods": hours_periods})
return exceptional_closures
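The `time_pattern` regex defined at the top of this module accepts values like `9h`, `9h30`, or `22h15`: the minutes group is optional, so `group(2)` may be `None` and is treated as 0, as in the parsing functions above.

```python
import re

# Same pattern as the module-level time_pattern above
time_pattern = re.compile("^([0-9]{1,2})h([0-9]{2})?$")

m = time_pattern.match("9h30")
hour, minute = int(m.group(1)), int(m.group(2) or 0)  # minutes default to 0
print(hour, minute)
```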
def parse_normal_opening_hours(values):
"""Parse normal opening hours"""
normal_opening_hours = []
for value in values:
days = []
for word in words:
if not word:
continue
parts = word.split("-")
if len(parts) == 1:
# ex: jeudi
if word not in week_days:
raise ValueError(f'"{word}" is not a valid week day')
if word not in days:
days.append(word)
elif len(parts) == 2:
if parts[0] in week_days and parts[1] in week_days:
# ex: lundi-jeudi
if week_days.index(parts[1]) <= week_days.index(parts[0]):
raise ValueError(f'"{parts[1]}" is before "{parts[0]}"')
started = False
for d in week_days:
if not started and d != parts[0]:
mstart = time_pattern.match(parts[0])
mstop = time_pattern.match(parts[1])
if not mstart or not mstop:
raise ValueError(f'"{word}" is not a valid time period')
hstart = datetime.time(int(mstart.group(1)), int(mstart.group(2) or 0))
hstop = datetime.time(int(mstop.group(1)), int(mstop.group(2) or 0))
if hstop <= hstart:
raise ValueError(f"Time {parts[1]} <= {parts[0]}")
hours_periods.append({"start": hstart, "stop": hstop})
else:
raise ValueError(f'Invalid number of part in this word: "{word}"')
if not days and not hours_periods:
raise ValueError(f'No days or hours period found in this value: "{value}"')
normal_opening_hours.append({"days": days, "hours_periods": hours_periods})
for idx, noh in enumerate(normal_opening_hours):
normal_opening_hours[idx]["hours_periods"] = sorted_hours_periods(noh["hours_periods"])
return sorted_opening_hours(normal_opening_hours)
def sorted_hours_periods(hours_periods):
"""Sort hours periods"""
return sorted(hours_periods, key=lambda hp: (hp["start"], hp["stop"]))
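The sort key used by `sorted_hours_periods()` orders periods first by start time, then by stop time. A quick check with two out-of-order periods:

```python
import datetime

periods = [
    {"start": datetime.time(14, 0), "stop": datetime.time(18, 0)},
    {"start": datetime.time(9, 0), "stop": datetime.time(12, 30)},
]
# Same key as sorted_hours_periods(): (start, stop)
ordered = sorted(periods, key=lambda hp: (hp["start"], hp["stop"]))
print(ordered[0]["start"])
```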
def sorted_opening_hours(opening_hours):
"""Sort opening hours"""
return sorted(
opening_hours,
key=lambda x: (
week_days.index(x["days"][0]) if x["days"] else None,
x["hours_periods"][0]["start"] if x["hours_periods"] else datetime.datetime.min.time(),
x["hours_periods"][0]["stop"] if x["hours_periods"] else datetime.datetime.max.time(),
),
)
def its_nonworking_day(nonworking_public_holidays_values, date=None):
"""Check if is a non-working day"""
if not nonworking_public_holidays_values:
return False
date = date if date else datetime.date.today()
log.debug("its_nonworking_day(%s): values=%s", date, nonworking_public_holidays_values)
nonworking_days = nonworking_french_public_days_of_the_year(year=date.year)
for day in nonworking_public_holidays_values:
if day in nonworking_days and nonworking_days[day] == date:
log.debug("its_nonworking_day(%s): %s", date, day)
return True
return False
def its_exceptionally_closed(exceptional_closures_values, when=None, parse=True, all_day=False):
"""Check if it's exceptionally closed"""
if not exceptional_closures_values:
return False
when = when if when else datetime.datetime.now()
assert isinstance(when, (datetime.date, datetime.datetime))
when_date = when.date() if isinstance(when, datetime.datetime) else when
exceptional_closures = (
parse_exceptional_closures(exceptional_closures_values)
if parse
else exceptional_closures_values
)
log.debug("its_exceptionally_closed(%s): exceptional closures=%s", when, exceptional_closures)
for cl in exceptional_closures:
if when_date not in cl["days"]:
log.debug(
"its_exceptionally_closed(%s): %s not in days (%s)", when, when_date, cl["days"]
)
continue
if not cl["hours_periods"]:
# All day exceptional closure
return True
if all_day:
# Wanted an all day closure, ignore it
continue
for hp in cl["hours_periods"]:
if hp["start"] <= when.time() <= hp["stop"]:
return True
return False
def get_exceptional_closures_hours(exceptional_closures_values, date=None, parse=True):
"""Get exceptional closures hours of the day"""
if not exceptional_closures_values:
return []
date = date if date else datetime.date.today()
exceptional_closures = (
parse_exceptional_closures(exceptional_closures_values)
if parse
else exceptional_closures_values
)
log.debug(
"get_exceptional_closures_hours(%s): exceptional closures=%s", date, exceptional_closures
)
exceptional_closures_hours = []
for cl in exceptional_closures:
if date not in cl["days"]:
log.debug("get_exceptional_closures_hours(%s): not in days (%s)", date, cl["days"])
continue
if not cl["hours_periods"]:
log.debug(
"get_exceptional_closures_hours(%s): it's exceptionally closed all the day", date
)
return [
{
"start": datetime.datetime.min.time(),
"stop": datetime.datetime.max.time(),
}
]
exceptional_closures_hours.extend(cl["hours_periods"])
log.debug(
"get_exceptional_closures_hours(%s): exceptional closures hours=%s",
date,
exceptional_closures_hours,
)
return sorted_hours_periods(exceptional_closures_hours)
def its_normally_open(normal_opening_hours_values, when=None, parse=True, ignore_time=False):
"""Check if it's normally open"""
when = when if when else datetime.datetime.now()
if not normal_opening_hours_values:
log.debug(
"its_normally_open(%s): no normal opening hours defined, consider as opened", when
)
return True
when_weekday = week_days[when.timetuple().tm_wday]
log.debug("its_normally_open(%s): week day=%s", when, when_weekday)
normal_opening_hours = (
parse_normal_opening_hours(normal_opening_hours_values)
if parse
else normal_opening_hours_values
)
log.debug("its_normally_open(%s): normal opening hours=%s", when, normal_opening_hours)
for oh in normal_opening_hours:
if oh["days"] and when_weekday not in oh["days"]:
log.debug("its_normally_open(%s): %s not in days (%s)", when, when_weekday, oh["days"])
continue
if not oh["hours_periods"] or ignore_time:
return True
for hp in oh["hours_periods"]:
if hp["start"] <= when.time() <= hp["stop"]:
return True
log.debug("its_normally_open(%s): not in normal opening hours", when)
return False
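`week_days` is indexed by `tm_wday` (0 = Monday), so the French weekday name used by `its_normally_open()` comes straight from `timetuple()`:

```python
import datetime

# Same week_days list as defined at the top of this module
week_days = ["lundi", "mardi", "mercredi", "jeudi", "vendredi", "samedi", "dimanche"]
day = datetime.date(2024, 4, 16)  # a Tuesday
name = week_days[day.timetuple().tm_wday]
print(name)
```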
def its_opening_day(
normal_opening_hours_values=None,
exceptional_closures_values=None,
nonworking_public_holidays_values=None,
date=None,
parse=True,
):
"""Check if it's an opening day"""
date = date if date else datetime.date.today()
if its_nonworking_day(nonworking_public_holidays_values, date=date):
return False
if its_exceptionally_closed(exceptional_closures_values, when=date, all_day=True, parse=parse):
return False
return its_normally_open(normal_opening_hours_values, when=date, parse=parse, ignore_time=True)
def is_closed(
normal_opening_hours_values=None,
exceptional_closures_values=None,
nonworking_public_holidays_values=None,
exceptional_closure_on_nonworking_public_days=False,
when=None,
on_error="raise",
):
"""Check if closed"""
if not when:
when = datetime.datetime.now()
when_date = when.date()
when_time = when.time()
when_weekday = week_days[when.timetuple().tm_wday]
on_error_result = None
if on_error == "closed":
on_error_result = {
"closed": True,
"exceptional_closure": False,
"exceptional_closure_all_day": False,
}
elif on_error == "opened":
on_error_result = {
"closed": False,
"exceptional_closure": False,
"exceptional_closure_all_day": False,
}
log.debug(
"When = %s => date = %s / time = %s / week day = %s",
when,
when_date,
when_time,
when_weekday,
)
# Handle non-working days
if its_nonworking_day(nonworking_public_holidays_values, date=when_date):
return {
"closed": True,
"exceptional_closure": exceptional_closure_on_nonworking_public_days,
"exceptional_closure_all_day": exceptional_closure_on_nonworking_public_days,
}
# Handle exceptional closures
try:
if its_exceptionally_closed(exceptional_closures_values, when=when):
return {
"closed": True,
"exceptional_closure": True,
"exceptional_closure_all_day": its_exceptionally_closed(
exceptional_closures_values, when=when, all_day=True
),
}
except ValueError as e:
if on_error_result is None:
log.error("Fail to parse exceptional closures", exc_info=True)
raise e from e
log.error("Fail to parse exceptional closures, consider as %s", on_error, exc_info=True)
return on_error_result
# Finally, handle normal opening hours
try:
return {
"closed": not its_normally_open(normal_opening_hours_values, when=when),
"exceptional_closure": False,
"exceptional_closure_all_day": False,
}
except ValueError as e: # pylint: disable=broad-except
if on_error_result is None:
log.error("Fail to parse normal opening hours", exc_info=True)
raise e from e
log.error("Fail to parse normal opening hours, consider as %s", on_error, exc_info=True)
return on_error_result
def next_opening_date(
normal_opening_hours_values=None,
exceptional_closures_values=None,
nonworking_public_holidays_values=None,
date=None,
max_anaylse_days=None,
parse=True,
):
"""Search for the next opening day"""
date = date if date else datetime.date.today()
max_anaylse_days = max_anaylse_days if max_anaylse_days is not None else 30
if parse:
try:
normal_opening_hours_values = (
parse_normal_opening_hours(normal_opening_hours_values)
if normal_opening_hours_values
else None
)
exceptional_closures_values = (
parse_exceptional_closures(exceptional_closures_values)
if exceptional_closures_values
else None
)
except ValueError: # pylint: disable=broad-except
log.error(
"next_opening_date(%s): fail to parse normal opening hours or exceptional closures",
date,
exc_info=True,
)
return False
added_days = 0
while added_days <= max_anaylse_days:
test_date = date + datetime.timedelta(days=added_days)
if its_opening_day(
normal_opening_hours_values=normal_opening_hours_values,
exceptional_closures_values=exceptional_closures_values,
nonworking_public_holidays_values=nonworking_public_holidays_values,
date=test_date,
parse=False,
):
return test_date
added_days += 1
log.debug(
"next_opening_date(%s): no opening day found in the next %d days", date, max_anaylse_days
)
return False
def next_opening_hour(
normal_opening_hours_values=None,
exceptional_closures_values=None,
nonworking_public_holidays_values=None,
when=None,
max_anaylse_days=None,
parse=True,
):
"""Search for the next opening hour"""
when = when if when else datetime.datetime.now()
max_anaylse_days = max_anaylse_days if max_anaylse_days is not None else 30
if parse:
try:
normal_opening_hours_values = (
parse_normal_opening_hours(normal_opening_hours_values)
if normal_opening_hours_values
else None
)
exceptional_closures_values = (
parse_exceptional_closures(exceptional_closures_values)
if exceptional_closures_values
else None
)
except ValueError:
log.error(
"next_opening_hour(%s): failed to parse normal opening hours or exceptional closures",
when,
exc_info=True,
)
return False
date = next_opening_date(
normal_opening_hours_values=normal_opening_hours_values,
exceptional_closures_values=exceptional_closures_values,
nonworking_public_holidays_values=nonworking_public_holidays_values,
date=when.date(),
max_anaylse_days=max_anaylse_days,
parse=False,
)
if not date:
log.debug(
"next_opening_hour(%s): no opening day found in the next %d days",
when,
max_anaylse_days,
)
return False
log.debug("next_opening_hour(%s): next opening date=%s", when, date)
weekday = week_days[date.timetuple().tm_wday]
log.debug("next_opening_hour(%s): next opening week day=%s", when, weekday)
exceptional_closures_hours = get_exceptional_closures_hours(
exceptional_closures_values, date=date, parse=False
)
log.debug(
"next_opening_hour(%s): next opening day exceptional closures hours=%s",
when,
exceptional_closures_hours,
)
next_opening_datetime = None
exceptionally_closed = False
exceptionally_closed_all_day = False
in_opening_hours = date != when.date()
for oh in normal_opening_hours_values:
if exceptionally_closed_all_day:
break
if oh["days"] and weekday not in oh["days"]:
log.debug("next_opening_hour(%s): %s not in days (%s)", when, weekday, oh["days"])
continue
log.debug(
"next_opening_hour(%s): %s in days (%s), handle opening hours %s",
when,
weekday,
oh["days"],
oh["hours_periods"],
)
if not oh["hours_periods"]:
log.debug(
"next_opening_hour(%s): %s is an all day opening day, handle exceptional closures "
"hours %s to find the minimal opening time",
when,
weekday,
exceptional_closures_hours,
)
if date == when.date():
in_opening_hours = True
test_time = when.time() if when.date() == date else datetime.datetime.min.time()
for cl in exceptional_closures_hours:
if cl["start"] <= test_time < cl["stop"]:
if cl["stop"] >= datetime.datetime.max.time():
exceptionally_closed = True
exceptionally_closed_all_day = True
next_opening_datetime = None
break
test_time = cl["stop"]
else:
break
if not exceptionally_closed_all_day:
candidate_next_opening_datetime = datetime.datetime.combine(date, test_time)
next_opening_datetime = (
candidate_next_opening_datetime
if not next_opening_datetime
or candidate_next_opening_datetime < next_opening_datetime
else next_opening_datetime
)
continue
log.debug(
"next_opening_hour(%s): only opened during some hours periods (%s) on %s, find the "
"minimal starting time",
when,
oh["hours_periods"],
weekday,
)
test_time = datetime.datetime.max.time()
for hp in oh["hours_periods"]:
if date == when.date() and hp["stop"] < when.time():
log.debug(
"next_opening_hour(%s): ignore opening hours %s before specified when time %s",
when,
hp,
when.time(),
)
continue
if date == when.date() and hp["start"] <= when.time() < hp["stop"]:
in_opening_hours = True
if exceptional_closures_hours:
log.debug(
"next_opening_hour(%s): check if opening hours %s match with exceptional "
"closure hours %s",
when,
hp,
exceptional_closures_hours,
)
for cl in exceptional_closures_hours:
if cl["start"] <= hp["start"] and cl["stop"] >= hp["stop"]:
log.debug(
"next_opening_hour(%s): opening hour %s is included in exceptional "
"closure hours %s",
when,
hp,
cl,
)
exceptionally_closed = True
break
if hp["start"] < cl["start"]:
log.debug(
"next_opening_hour(%s): opening hour %s start before closure hours %s",
when,
hp,
cl,
)
test_time = hp["start"] if hp["start"] < test_time else test_time
elif cl["stop"] >= hp["start"] and cl["stop"] < hp["stop"]:
log.debug(
"next_opening_hour(%s): opening hour %s end after closure hours %s",
when,
hp,
cl,
)
test_time = cl["stop"] if cl["stop"] < test_time else test_time
elif hp["start"] < test_time:
log.debug(
"next_opening_hour(%s): no exceptional closure hours, use opening hours start "
"time %s",
when,
hp["start"],
)
test_time = hp["start"]
if test_time < datetime.datetime.max.time():
if date == when.date() and test_time < when.time():
test_time = when.time()
candidate_next_opening_datetime = datetime.datetime.combine(date, test_time)
next_opening_datetime = (
candidate_next_opening_datetime
if not next_opening_datetime
or candidate_next_opening_datetime < next_opening_datetime
else next_opening_datetime
)
if not next_opening_datetime and (
exceptionally_closed or (date == when.date() and not in_opening_hours)
):
new_max_anaylse_days = max_anaylse_days - (date - when.date()).days
if new_max_anaylse_days > 0:
log.debug(
"next_opening_hour(%s): exceptionally closed on %s, try on following %d days",
when,
date,
new_max_anaylse_days,
)
next_opening_datetime = next_opening_hour(
normal_opening_hours_values=normal_opening_hours_values,
exceptional_closures_values=exceptional_closures_values,
nonworking_public_holidays_values=nonworking_public_holidays_values,
when=datetime.datetime.combine(
date + datetime.timedelta(days=1), datetime.datetime.min.time()
),
max_anaylse_days=new_max_anaylse_days,
parse=False,
)
if not next_opening_datetime:
log.debug(
"next_opening_hour(%s): no opening hours found in next %d days", when, max_anaylse_days
)
return False
log.debug("next_opening_hour(%s): next opening hours=%s", when, next_opening_datetime)
return next_opening_datetime
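On the chosen day, the core job of `next_opening_hour()` is finding the earliest time that falls inside a normal opening period but outside every exceptional closure. A minimal standalone sketch of that search, using plain `(start, stop)` `datetime.time` tuples instead of the module's dict-based period structures:

```python
import datetime

def earliest_opening(periods, closures, not_before=datetime.time.min):
    """Earliest time >= not_before that is inside one of the opening
    periods (start, stop) and outside every closure period, else None."""
    best = None
    for start, stop in periods:
        t = max(start, not_before)
        # Push t past any closure covering it; iterating sorted closures
        # crosses chained closures in order.
        for c_start, c_stop in sorted(closures):
            if c_start <= t < c_stop:
                t = c_stop
        if t < stop and (best is None or t < best):
            best = t
    return best

nine, ten30, noon = datetime.time(9), datetime.time(10, 30), datetime.time(12)
print(earliest_opening([(nine, noon)], [(nine, ten30)]))  # → 10:30:00
```

A closure spanning the whole remaining period yields `None`, which corresponds to the `exceptionally_closed` case above where the search restarts on the next day.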
def previous_opening_date(
normal_opening_hours_values=None,
exceptional_closures_values=None,
nonworking_public_holidays_values=None,
date=None,
max_anaylse_days=None,
parse=True,
):
"""Search for the previous opening day"""
date = date if date else datetime.date.today()
max_anaylse_days = max_anaylse_days if max_anaylse_days is not None else 30
if parse:
try:
normal_opening_hours_values = (
parse_normal_opening_hours(normal_opening_hours_values)
if normal_opening_hours_values
else None
)
exceptional_closures_values = (
parse_exceptional_closures(exceptional_closures_values)
if exceptional_closures_values
else None
)
except ValueError:
log.error(
"previous_opening_date(%s): failed to parse normal opening hours or exceptional "
"closures",
date,
exc_info=True,
)
return False
days = 0
while days <= max_anaylse_days:
test_date = date - datetime.timedelta(days=days)
if its_opening_day(
normal_opening_hours_values=normal_opening_hours_values,
exceptional_closures_values=exceptional_closures_values,
nonworking_public_holidays_values=nonworking_public_holidays_values,
date=test_date,
parse=False,
):
return test_date
days += 1
log.debug(
"previous_opening_date(%s): no opening day found in the previous %d days",
date,
max_anaylse_days,
)
return False
def previous_opening_hour(
normal_opening_hours_values=None,
exceptional_closures_values=None,
nonworking_public_holidays_values=None,
when=None,
max_anaylse_days=None,
parse=True,
):
"""Search for the previous opening hour"""
when = when if when else datetime.datetime.now()
max_anaylse_days = max_anaylse_days if max_anaylse_days is not None else 30
if parse:
try:
normal_opening_hours_values = (
parse_normal_opening_hours(normal_opening_hours_values)
if normal_opening_hours_values
else None
)
exceptional_closures_values = (
parse_exceptional_closures(exceptional_closures_values)
if exceptional_closures_values
else None
)
except ValueError:
log.error(
"previous_opening_hour(%s): failed to parse normal opening hours or exceptional "
"closures",
when,
exc_info=True,
)
return False
date = previous_opening_date(
normal_opening_hours_values=normal_opening_hours_values,
exceptional_closures_values=exceptional_closures_values,
nonworking_public_holidays_values=nonworking_public_holidays_values,
date=when.date(),
max_anaylse_days=max_anaylse_days,
parse=False,
)
if not date:
log.debug(
"previous_opening_hour(%s): no opening day found in the previous %d days",
when,
max_anaylse_days,
)
return False
log.debug("previous_opening_hour(%s): previous opening date=%s", when, date)
weekday = week_days[date.timetuple().tm_wday]
log.debug("previous_opening_hour(%s): previous opening week day=%s", when, weekday)
exceptional_closures_hours = get_exceptional_closures_hours(
exceptional_closures_values, date=date, parse=False
)
log.debug(
"previous_opening_hour(%s): previous opening day exceptional closures hours=%s",
when,
exceptional_closures_hours,
)
previous_opening_datetime = None
exceptionally_closed = False
exceptionally_closed_all_day = False
in_opening_hours = date != when.date()
for oh in reversed(normal_opening_hours_values):
if exceptionally_closed_all_day:
break
if oh["days"] and weekday not in oh["days"]:
log.debug("previous_opening_hour(%s): %s not in days (%s)", when, weekday, oh["days"])
continue
log.debug(
"previous_opening_hour(%s): %s in days (%s), handle opening hours %s",
when,
weekday,
oh["days"],
oh["hours_periods"],
)
if not oh["hours_periods"]:
log.debug(
"previous_opening_hour(%s): %s is an all day opening day, handle exceptional "
"closures hours %s to find the maximal opening time",
when,
weekday,
exceptional_closures_hours,
)
if date == when.date():
in_opening_hours = True
test_time = when.time() if when.date() == date else datetime.datetime.max.time()
for cl in exceptional_closures_hours:
if cl["start"] <= test_time < cl["stop"]:
if cl["start"] <= datetime.datetime.min.time():
exceptionally_closed = True
exceptionally_closed_all_day = True
previous_opening_datetime = None
break
test_time = cl["start"]
else:
break
if not exceptionally_closed_all_day:
candidate_previous_opening_datetime = datetime.datetime.combine(date, test_time)
previous_opening_datetime = (
candidate_previous_opening_datetime
if not previous_opening_datetime
or candidate_previous_opening_datetime > previous_opening_datetime
else previous_opening_datetime
)
continue
log.debug(
"previous_opening_hour(%s): only opened during some hours periods (%s) on %s, find the "
"maximal opening time",
when,
oh["hours_periods"],
weekday,
)
test_time = datetime.datetime.min.time()
for hp in reversed(oh["hours_periods"]):
if date == when.date() and hp["start"] > when.time():
log.debug(
"previous_opening_hour(%s): ignore opening hours %s starting before specified "
"when time %s",
when,
hp,
when.time(),
)
continue
if date == when.date() and hp["start"] <= when.time() < hp["stop"]:
in_opening_hours = True
if exceptional_closures_hours:
log.debug(
"previous_opening_hour(%s): check if opening hours %s match with exceptional "
"closure hours %s",
when,
hp,
exceptional_closures_hours,
)
for cl in reversed(exceptional_closures_hours):
if cl["start"] <= hp["start"] and cl["stop"] >= hp["stop"]:
log.debug(
"previous_opening_hour(%s): opening hour %s is included in exceptional "
"closure hours %s",
when,
hp,
cl,
)
exceptionally_closed = True
break
if cl["stop"] < hp["stop"]:
log.debug(
"previous_opening_hour(%s): opening hour %s end after closure hours %s",
when,
hp,
cl,
)
test_time = hp["stop"] if hp["stop"] > test_time else test_time
elif cl["start"] > hp["stop"]:
log.debug(
"previous_opening_hour(%s): opening hour %s start before closure hours "
"%s",
when,
hp,
cl,
)
test_time = hp["stop"] if hp["stop"] > test_time else test_time
elif cl["stop"] >= hp["stop"] and cl["start"] > hp["start"]:
log.debug(
"previous_opening_hour(%s): opening hour %s start before closure hours "
"%s",
when,
hp,
cl,
)
test_time = cl["start"] if cl["start"] > test_time else test_time
elif hp["stop"] > test_time:
log.debug(
"previous_opening_hour(%s): no exceptional closure hours, use opening hours "
"stop time %s",
when,
hp["stop"],
)
test_time = hp["stop"]
if test_time > datetime.datetime.min.time():
if date == when.date() and test_time > when.time():
test_time = when.time()
candidate_previous_opening_datetime = datetime.datetime.combine(date, test_time)
previous_opening_datetime = (
candidate_previous_opening_datetime
if not previous_opening_datetime
or candidate_previous_opening_datetime > previous_opening_datetime
else previous_opening_datetime
)
if not previous_opening_datetime and (
exceptionally_closed or (date == when.date() and not in_opening_hours)
):
new_max_anaylse_days = max_anaylse_days - (when.date() - date).days
if new_max_anaylse_days > 0:
log.debug(
"previous_opening_hour(%s): exceptionally closed on %s, try on previous %d days",
when,
date,
new_max_anaylse_days,
)
previous_opening_datetime = previous_opening_hour(
normal_opening_hours_values=normal_opening_hours_values,
exceptional_closures_values=exceptional_closures_values,
nonworking_public_holidays_values=nonworking_public_holidays_values,
when=datetime.datetime.combine(
date - datetime.timedelta(days=1), datetime.datetime.max.time()
),
max_anaylse_days=new_max_anaylse_days,
parse=False,
)
if not previous_opening_datetime:
log.debug(
"previous_opening_hour(%s): no opening hours found in previous %d days",
when,
max_anaylse_days,
)
return False
log.debug(
"previous_opening_hour(%s): previous opening hours=%s", when, previous_opening_datetime
)
return previous_opening_datetime


@@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
""" Oracle client """
import logging
@@ -7,110 +5,43 @@ import sys
import cx_Oracle
from mylib.db import DB, DBFailToConnect
log = logging.getLogger(__name__)
#
# Exceptions
#
class OracleDB(DB):
"""Oracle client"""
class OracleDBException(Exception):
""" That is the base exception class for all the other exceptions provided by this module. """
_dsn = None
_user = None
_pwd = None
def __init__(self, error, *args, **kwargs):
for arg, value in kwargs.items():
setattr(self, arg, value)
super().__init__(error.format(*args, **kwargs))
class OracleDBFailToConnect(OracleDBException, RuntimeError):
"""
Raised on connecting error occurred
"""
def __init__(self, dsn, user):
super().__init__(
"An error occured during Oracle database connection ({user}@{dsn})",
user=user, dsn=dsn
)
class OracleDBDuplicatedSQLParameter(OracleDBException, KeyError):
"""
Raised when trying to set a SQL query parameter
and an other parameter with the same name is already set
"""
def __init__(self, parameter_name):
super().__init__(
"Duplicated SQL parameter '{parameter_name}'",
parameter_name=parameter_name
)
class OracleDBUnsupportedWHEREClauses(OracleDBException, TypeError):
"""
Raised when trying to execute query with unsupported
WHERE clauses provided
"""
def __init__(self, where_clauses):
super().__init__(
"Unsupported WHERE clauses: {where_clauses}",
where_clauses=where_clauses
)
class OracleDBInvalidOrderByClause(OracleDBException, TypeError):
"""
Raised when trying to select on table with invalid
ORDER BY clause provided
"""
def __init__(self, order_by):
super().__init__(
"Invalid ORDER BY clause: {order_by}. Must be a string or a list of two values (ordering field name and direction)",
order_by=order_by
)
class OracleDB:
""" Oracle client """
def __init__(self, dsn, user, pwd, just_try=False):
def __init__(self, dsn, user, pwd, **kwargs):
self._dsn = dsn
self._user = user
self._pwd = pwd
self._conn = None
self.just_try = just_try
super().__init__(**kwargs)
def connect(self, exit_on_error=True):
""" Connect to Oracle server """
"""Connect to Oracle server"""
if self._conn is None:
log.info('Connect on Oracle server with DSN %s as %s', self._dsn, self._user)
log.info("Connect on Oracle server with DSN %s as %s", self._dsn, self._user)
try:
self._conn = cx_Oracle.connect(
user=self._user,
password=self._pwd,
dsn=self._dsn
)
except Exception as err:
self._conn = cx_Oracle.connect(user=self._user, password=self._pwd, dsn=self._dsn)
except cx_Oracle.Error as err:
log.fatal(
'An error occured during Oracle database connection (%s@%s).',
self._user, self._dsn, exc_info=1
"An error occurred during Oracle database connection (%s@%s).",
self._user,
self._dsn,
exc_info=1,
)
if exit_on_error:
sys.exit(1)
else:
raise OracleDBFailToConnect(self._dsn, self._user) from err
raise DBFailToConnect(f"{self._user}@{self._dsn}") from err
return True
def close(self):
""" Close connection with Oracle server (if opened) """
if self._conn:
self._conn.close()
self._conn = None
def doSQL(self, sql, params=None):
"""
Run SQL query and commit changes (rollback on error)
@@ -125,32 +56,17 @@ class OracleDB:
log.debug("Just-try mode : do not really execute SQL query '%s'", sql)
return True
cursor = self._conn.cursor()
try:
log.debug(
'Run SQL query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params"
)
if isinstance(params, dict):
cursor.execute(sql, **params)
else:
cursor.execute(sql)
self._log_query(sql, params)
with self._conn.cursor() as cursor:
if isinstance(params, dict):
cursor.execute(sql, **params)
else:
cursor.execute(sql)
self._conn.commit()
return True
except Exception:
log.error(
'Error during SQL query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params",
exc_info=True
)
except cx_Oracle.Error:
self._log_query_exception(sql, params)
self._conn.rollback()
return False
@@ -164,33 +80,20 @@ class OracleDB:
:return: List of selected rows as dict on success, False otherwise
:rtype: list, bool
"""
cursor = self._conn.cursor()
try:
log.debug(
'Run SQL SELECT query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params"
)
if isinstance(params, dict):
cursor.execute(sql, **params)
else:
cursor.execute(sql)
cursor.rowfactory = lambda *args: dict(zip([d[0] for d in cursor.description], args))
results = cursor.fetchall()
self._log_query(sql, params)
with self._conn.cursor() as cursor:
if isinstance(params, dict):
cursor.execute(sql, **params)
else:
cursor.execute(sql)
cursor.rowfactory = lambda *args: dict(
zip([d[0] for d in cursor.description], args)
)
results = cursor.fetchall()
return results
except Exception:
log.error(
'Error during SQL query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params",
exc_info=True
)
except cx_Oracle.Error:
self._log_query_exception(sql, params)
return False
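The `rowfactory` assigned in `doSelect()` is nothing more than `dict(zip(...))` over `cursor.description`. Shown here with stand-in description data so the mapping runs without cx_Oracle (only the first item of each 7-tuple, the column name, matters):

```python
# Stand-in for cursor.description: a list of 7-item tuples whose first
# element is the column name.
description = [("ID", None, None, None, None, None, None),
               ("NAME", None, None, None, None, None, None)]

def row_to_dict(description, row):
    """Replicate the rowfactory used in doSelect(): map one fetched
    tuple onto a dict keyed by the cursor's column names."""
    return dict(zip([d[0] for d in description], row))

print(row_to_dict(description, (1, "alice")))  # → {'ID': 1, 'NAME': 'alice'}
```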
#
@@ -199,222 +102,5 @@ class OracleDB:
@staticmethod
def format_param(param):
return ':{0}'.format(param)
@classmethod
def _combine_params(cls, params, to_add=None, **kwargs):
if to_add:
assert isinstance(to_add, dict), "to_add must be a dict or None"
params = cls._combine_params(params, **to_add)
for param, value in kwargs.items():
if param in params:
raise OracleDBDuplicatedSQLParameter(param)
params[param] = value
return params
@classmethod
def _format_where_clauses(cls, where_clauses, params=None, where_op=None):
"""
Format WHERE clauses
:param where_clauses: The WHERE clauses. Could be:
- a raw SQL WHERE clause as string
- a tuple of two elements: a raw WHERE clause and its parameters as dict
- a dict of WHERE clauses with field name as key and WHERE clause value as value
- a list of any of previous valid WHERE clauses
:param params: Dict of other already set SQL query parameters (optional)
:param where_op: SQL operator used to combine WHERE clauses together (optional, default: AND)
:return: A tuple of two elements: raw SQL WHERE combined clauses and parameters on success
:rtype: string, bool
"""
if params is None:
params = dict()
if where_op is None:
where_op = 'AND'
if isinstance(where_clauses, str):
return (where_clauses, params)
if isinstance(where_clauses, tuple) and len(where_clauses) == 2 and isinstance(where_clauses[1], dict):
cls._combine_params(params, where_clauses[1])
return (where_clauses[0], params)
if isinstance(where_clauses, (list, tuple)):
sql_where_clauses = []
for where_clause in where_clauses:
sql2, params = cls._format_where_clauses(where_clause, params=params, where_op=where_op)
sql_where_clauses.append(sql2)
return (
(" %s " % where_op).join(sql_where_clauses),
params
)
if isinstance(where_clauses, dict):
sql_where_clauses = []
for field, value in where_clauses.items():
param = field
if field in params:
idx = 1
while param in params:
param = '%s_%d' % (field, idx)
idx += 1
cls._combine_params(params, {param: value})
sql_where_clauses.append(
'"{field}" = {param}'.format(field=field, param=cls.format_param(param))
)
return (
(" %s " % where_op).join(sql_where_clauses),
params
)
raise OracleDBUnsupportedWHEREClauses(where_clauses)
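The dict branch above (quote the field name, allocate a uniquely-named parameter, join fragments with the operator) can be exercised standalone. This is a simplified sketch of that one branch, not the module's full `_format_where_clauses()` API; the `placeholder` default mimics Oracle's `:name` style from `format_param()`:

```python
def format_where_dict(where, params=None, where_op="AND",
                      placeholder=lambda p: f":{p}"):
    """Build '"field" = :param' fragments from a dict of WHERE clauses,
    renaming a parameter with a numeric suffix when its name is taken."""
    params = dict(params or {})
    fragments = []
    for field, value in where.items():
        param = field
        idx = 1
        while param in params:  # avoid clobbering an existing parameter
            param = f"{field}_{idx}"
            idx += 1
        params[param] = value
        fragments.append(f'"{field}" = {placeholder(param)}')
    return f" {where_op} ".join(fragments), params

print(format_where_dict({"a": 1, "b": 2}))
# → ('"a" = :a AND "b" = :b', {'a': 1, 'b': 2})
```

The suffixing loop is what lets an UPDATE set a column and also filter on it: the second use of the same field name becomes `field_1`.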
@classmethod
def _add_where_clauses(cls, sql, params, where_clauses, where_op=None):
"""
Add WHERE clauses to an SQL query
:param sql: The SQL query to complete
:param params: The dict of parameters of the SQL query to complete
:param where_clauses: The WHERE clause (see _format_where_clauses())
:param where_op: SQL operator used to combine WHERE clauses together (optional, default: see _format_where_clauses())
:return:
:rtype: A tuple of two elements: raw SQL WHERE combined clauses and parameters
"""
if where_clauses:
sql_where, params = cls._format_where_clauses(where_clauses, params=params, where_op=where_op)
sql += " WHERE " + sql_where
return (sql, params)
@staticmethod
def _quote_table_name(table):
""" Quote table name """
return '"{0}"'.format(
'"."'.join(
table.split('.')
)
)
def insert(self, table, values, just_try=False):
""" Run INSERT SQL query """
sql = 'INSERT INTO {0} ("{1}") VALUES ({2})'.format(
self._quote_table_name(table),
'", "'.join(values.keys()),
", ".join([
self.format_param(key)
for key in values
])
)
if just_try:
log.debug("Just-try mode: execute INSERT query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=values):
log.error("Fail to execute INSERT query (SQL: %s)", sql)
return False
return True
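The statement `insert()` assembles can be reproduced in a few lines. A hedged standalone sketch (not the class method itself): every component of a dotted table path and every column name is double-quoted, and values become named placeholders.

```python
def build_insert(table, values, placeholder=lambda p: f":{p}"):
    """Sketch of the SQL string insert() builds from a table name and
    a dict of column -> value."""
    # 'app.users' -> '"app"."users"', matching _quote_table_name()
    quoted_table = '"{}"'.format('"."'.join(table.split(".")))
    columns = '", "'.join(values)
    placeholders = ", ".join(placeholder(k) for k in values)
    return f'INSERT INTO {quoted_table} ("{columns}") VALUES ({placeholders})'

print(build_insert("app.users", {"id": 1, "name": "alice"}))
# → INSERT INTO "app"."users" ("id", "name") VALUES (:id, :name)
```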
def update(self, table, values, where_clauses, where_op=None, just_try=False):
""" Run UPDATE SQL query """
sql = 'UPDATE {0} SET {1}'.format(
self._quote_table_name(table),
", ".join([
'"{0}" = {1}'.format(key, self.format_param(key))
for key in values
])
)
params = values
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (OracleDBDuplicatedSQLParameter, OracleDBUnsupportedWHEREClauses):
log.error('Fail to add WHERE clauses', exc_info=True)
return False
if just_try:
log.debug("Just-try mode: execute UPDATE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=params):
log.error("Fail to execute UPDATE query (SQL: %s)", sql)
return False
return True
def delete(self, table, where_clauses, where_op='AND', just_try=False):
""" Run DELETE SQL query """
sql = 'DELETE FROM {0}'.format(self._quote_table_name(table))
params = dict()
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (OracleDBDuplicatedSQLParameter, OracleDBUnsupportedWHEREClauses):
log.error('Fail to add WHERE clauses', exc_info=True)
return False
if just_try:
log.debug("Just-try mode: execute DELETE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=params):
log.error("Fail to execute DELETE query (SQL: %s)", sql)
return False
return True
def truncate(self, table, just_try=False):
""" Run TRUNCATE SQL query """
sql = 'TRUNCATE TABLE {0}'.format(self._quote_table_name(table))
if just_try:
log.debug("Just-try mode: execute TRUNCATE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql):
log.error("Fail to execute TRUNCATE query (SQL: %s)", sql)
return False
return True
def select(self, table, where_clauses=None, fields=None, where_op='AND', order_by=None, just_try=False):
""" Run SELECT SQL query """
sql = "SELECT "
if fields is None:
sql += "*"
elif isinstance(fields, str):
sql += '"{0}"'.format(fields)
else:
sql += '"{0}"'.format('", "'.join(fields))
sql += ' FROM {0}'.format(self._quote_table_name(table))
params = dict()
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (OracleDBDuplicatedSQLParameter, OracleDBUnsupportedWHEREClauses):
log.error('Fail to add WHERE clauses', exc_info=True)
return False
if order_by:
if isinstance(order_by, str):
sql += ' ORDER BY {0}'.format(order_by)
elif (
isinstance(order_by, (list, tuple)) and len(order_by) == 2
and isinstance(order_by[0], str)
and isinstance(order_by[1], str)
and order_by[1].upper() in ('ASC', 'DESC')
):
sql += ' ORDER BY "{0}" {1}'.format(order_by[0], order_by[1].upper())
else:
raise OracleDBInvalidOrderByClause(order_by)
if just_try:
log.debug("Just-try mode: execute SELECT query : %s", sql)
return just_try
return self.doSelect(sql, params=params)
"""Format SQL query parameter for prepared query"""
return f":{param}"


@@ -1,10 +1,8 @@
# coding: utf8
""" Progress bar """
import logging
import progressbar
log = logging.getLogger(__name__)
@@ -25,15 +23,15 @@ class Pbar: # pylint: disable=useless-object-inheritance
self.__count = 0
self.__pbar = progressbar.ProgressBar(
widgets=[
name + ': ',
name + ": ",
progressbar.Percentage(),
' ',
" ",
progressbar.Bar(),
' ',
" ",
progressbar.SimpleProgress(),
progressbar.ETA()
progressbar.ETA(),
],
maxval=maxval
maxval=maxval,
).start()
else:
log.info(name)
@@ -49,6 +47,6 @@ class Pbar: # pylint: disable=useless-object-inheritance
self.__pbar.update(self.__count)
def finish(self):
""" Finish the progress bar """
"""Finish the progress bar"""
if self.__pbar:
self.__pbar.finish()


@@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
""" PostgreSQL client """
import datetime
@@ -7,128 +5,75 @@ import logging
import sys
import psycopg2
from psycopg2.extras import RealDictCursor
from mylib.db import DB, DBFailToConnect
log = logging.getLogger(__name__)
#
# Exceptions
#
class PgDB(DB):
"""PostgreSQL client"""
class PgDBException(Exception):
""" That is the base exception class for all the other exceptions provided by this module. """
_host = None
_user = None
_pwd = None
_db = None
def __init__(self, error, *args, **kwargs):
for arg, value in kwargs.items():
setattr(self, arg, value)
super().__init__(error.format(*args, **kwargs))
date_format = "%Y-%m-%d"
datetime_format = "%Y-%m-%d %H:%M:%S"
class PgDBFailToConnect(PgDBException, RuntimeError):
"""
Raised on connecting error occurred
"""
def __init__(self, host, user, db):
super().__init__(
"An error occured during Postgresql database connection ({user}@{host}, database={db})",
user=user, host=host, db=db
)
class PgDBDuplicatedSQLParameter(PgDBException, KeyError):
"""
Raised when trying to set a SQL query parameter
and an other parameter with the same name is already set
"""
def __init__(self, parameter_name):
super().__init__(
"Duplicated SQL parameter '{parameter_name}'",
parameter_name=parameter_name
)
class PgDBUnsupportedWHEREClauses(PgDBException, TypeError):
"""
Raised when trying to execute query with unsupported
WHERE clauses provided
"""
def __init__(self, where_clauses):
super().__init__(
"Unsupported WHERE clauses: {where_clauses}",
where_clauses=where_clauses
)
class PgDBInvalidOrderByClause(PgDBException, TypeError):
"""
Raised when trying to select on table with invalid
ORDER BY clause provided
"""
def __init__(self, order_by):
super().__init__(
"Invalid ORDER BY clause: {order_by}. Must be a string or a list of two values (ordering field name and direction)",
order_by=order_by
)
class PgDB:
""" PostgreSQL client """
date_format = '%Y-%m-%d'
datetime_format = '%Y-%m-%d %H:%M:%S'
def __init__(self, host, user, pwd, db, just_try=False):
def __init__(self, host, user, pwd, db, **kwargs):
self._host = host
self._user = user
self._pwd = pwd
self._db = db
self._conn = None
self.just_try = just_try
super().__init__(**kwargs)
def connect(self, exit_on_error=True):
""" Connect to PostgreSQL server """
"""Connect to PostgreSQL server"""
if self._conn is None:
try:
log.info(
'Connect on PostgreSQL server %s as %s on database %s',
self._host, self._user, self._db)
self._conn = psycopg2.connect(
dbname=self._db,
user=self._user,
host=self._host,
password=self._pwd
"Connect on PostgreSQL server %s as %s on database %s",
self._host,
self._user,
self._db,
)
except Exception as err:
self._conn = psycopg2.connect(
dbname=self._db, user=self._user, host=self._host, password=self._pwd
)
except psycopg2.Error as err:
log.fatal(
'An error occured during Postgresql database connection (%s@%s, database=%s).',
self._user, self._host, self._db, exc_info=1
"An error occurred during Postgresql database connection (%s@%s, database=%s).",
self._user,
self._host,
self._db,
exc_info=1,
)
if exit_on_error:
sys.exit(1)
else:
raise PgDBFailToConnect(self._host, self._user, self._db) from err
raise DBFailToConnect(f"{self._user}@{self._host}:{self._db}") from err
return True
def close(self):
""" Close connection with PostgreSQL server (if opened) """
"""Close connection with PostgreSQL server (if opened)"""
if self._conn:
self._conn.close()
self._conn = None
def setEncoding(self, enc):
""" Set connection encoding """
"""Set connection encoding"""
if self._conn:
try:
self._conn.set_client_encoding(enc)
return True
except Exception:
except psycopg2.Error:
log.error(
'An error occured setting Postgresql database connection encoding to "%s"',
enc, exc_info=1
'An error occurred setting Postgresql database connection encoding to "%s"',
enc,
exc_info=1,
)
return False
@@ -148,30 +93,15 @@ class PgDB:
cursor = self._conn.cursor()
try:
log.debug(
'Run SQL query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params"
)
self._log_query(sql, params)
if params is None:
cursor.execute(sql)
else:
cursor.execute(sql, params)
self._conn.commit()
return True
except Exception:
log.error(
'Error during SQL query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params",
exc_info=True
)
except psycopg2.Error:
self._log_query_exception(sql, params)
self._conn.rollback()
return False
@@ -185,266 +115,25 @@ class PgDB:
:return: List of selected rows as dict on success, False otherwise
:rtype: list, bool
"""
cursor = self._conn.cursor()
cursor = self._conn.cursor(cursor_factory=RealDictCursor)
try:
log.debug(
'Run SQL SELECT query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params"
)
self._log_query(sql, params)
cursor.execute(sql, params)
results = cursor.fetchall()
return results
except Exception:
log.error(
'Error during SQL query "%s" %s',
sql,
"with params = %s" % ', '.join([
"%s = %s" % (key, value)
for key, value in params.items()
]) if params else "without params",
exc_info=True
)
return list(map(dict, results))
except psycopg2.Error:
self._log_query_exception(sql, params)
return False
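The doSQL()/doSelect() pattern above (execute, commit or roll back, return rows as dicts) can be sketched standalone. The sketch below uses the stdlib sqlite3 module as a stand-in for psycopg2, so `:name` placeholders replace psycopg2's `%(name)s` style and `sqlite3.Row` replaces `RealDictCursor`:

```python
import sqlite3


def do_sql(conn, sql, params=None):
    """Execute a write query: commit on success, roll back and return False on error."""
    cursor = conn.cursor()
    try:
        cursor.execute(sql, params or {})
        conn.commit()
        return True
    except sqlite3.Error:
        conn.rollback()
        return False


def do_select(conn, sql, params=None):
    """Execute a SELECT query and return rows as a list of dicts, or False on error."""
    conn.row_factory = sqlite3.Row  # dict-like rows, akin to RealDictCursor
    cursor = conn.cursor()
    try:
        cursor.execute(sql, params or {})
        return [dict(row) for row in cursor.fetchall()]
    except sqlite3.Error:
        return False


conn = sqlite3.connect(":memory:")
do_sql(conn, "CREATE TABLE users (name TEXT)")
do_sql(conn, "INSERT INTO users (name) VALUES (:name)", {"name": "alice"})
print(do_select(conn, "SELECT name FROM users"))  # [{'name': 'alice'}]
```

Returning False on failure (rather than raising) matches the calling convention of the insert()/update()/delete() helpers below, which all test the return value.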
#
# SQL helpers
#
@staticmethod
def format_param(param):
return '%({0})s'.format(param)
@classmethod
def _combine_params(cls, params, to_add=None, **kwargs):
if to_add:
assert isinstance(to_add, dict), "to_add must be a dict or None"
params = cls._combine_params(params, **to_add)
for param, value in kwargs.items():
if param in params:
raise PgDBDuplicatedSQLParameter(param)
params[param] = value
return params
@classmethod
def _format_where_clauses(cls, where_clauses, params=None, where_op=None):
"""
Format WHERE clauses
:param where_clauses: The WHERE clauses. Could be:
- a raw SQL WHERE clause as string
- a tuple of two elements: a raw WHERE clause and its parameters as dict
- a dict of WHERE clauses with field name as key and WHERE clause value as value
- a list of any of previous valid WHERE clauses
:param params: Dict of other already set SQL query parameters (optional)
:param where_op: SQL operator used to combine WHERE clauses together (optional, default: AND)
:return: A tuple of two elements: raw SQL WHERE combined clauses and parameters on success
:rtype: string, bool
"""
if params is None:
params = dict()
if where_op is None:
where_op = 'AND'
if isinstance(where_clauses, str):
return (where_clauses, params)
if isinstance(where_clauses, tuple) and len(where_clauses) == 2 and isinstance(where_clauses[1], dict):
cls._combine_params(params, where_clauses[1])
return (where_clauses[0], params)
if isinstance(where_clauses, (list, tuple)):
sql_where_clauses = []
for where_clause in where_clauses:
sql2, params = cls._format_where_clauses(where_clause, params=params, where_op=where_op)
sql_where_clauses.append(sql2)
return (
(" %s " % where_op).join(sql_where_clauses),
params
)
if isinstance(where_clauses, dict):
sql_where_clauses = []
for field, value in where_clauses.items():
param = field
if field in params:
idx = 1
while param in params:
param = '%s_%d' % (field, idx)
idx += 1
cls._combine_params(params, {param: value})
sql_where_clauses.append(
'"{field}" = {param}'.format(field=field, param=cls.format_param(param))
)
return (
(" %s " % where_op).join(sql_where_clauses),
params
)
raise PgDBUnsupportedWHEREClauses(where_clauses)
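The accepted WHERE clause shapes can be demonstrated with a simplified standalone copy of this logic. This sketch covers only the string, dict, and list cases; the tuple-with-params case and the duplicated-parameter exception are omitted, and a plain ValueError stands in for PgDBUnsupportedWHEREClauses:

```python
def format_param(param):
    # psycopg2 named-placeholder style used by PgDB
    return f"%({param})s"


def format_where_clauses(where_clauses, params=None, where_op="AND"):
    """Minimal sketch of PgDB._format_where_clauses() for the str/dict/list cases."""
    params = params if params is not None else {}
    if isinstance(where_clauses, str):
        return where_clauses, params
    if isinstance(where_clauses, dict):
        sql_parts = []
        for field, value in where_clauses.items():
            param = field
            idx = 1
            while param in params:  # deduplicate parameter names
                param = f"{field}_{idx}"
                idx += 1
            params[param] = value
            sql_parts.append(f'"{field}" = {format_param(param)}')
        return f" {where_op} ".join(sql_parts), params
    if isinstance(where_clauses, (list, tuple)):
        sql_parts = []
        for clause in where_clauses:
            sql, params = format_where_clauses(clause, params=params, where_op=where_op)
            sql_parts.append(sql)
        return f" {where_op} ".join(sql_parts), params
    raise ValueError(where_clauses)


sql, params = format_where_clauses({"name": "alice", "age": 42})
print(sql)     # "name" = %(name)s AND "age" = %(age)s
print(params)  # {'name': 'alice', 'age': 42}
```

Mixed forms compose: a list may combine raw SQL strings and dicts, and each dict field becomes a named parameter, renamed with a numeric suffix on collision.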
@classmethod
def _add_where_clauses(cls, sql, params, where_clauses, where_op=None):
"""
Add WHERE clauses to an SQL query
:param sql: The SQL query to complete
:param params: The dict of parameters of the SQL query to complete
:param where_clauses: The WHERE clause (see _format_where_clauses())
:param where_op: SQL operator used to combine WHERE clauses together (optional, default: see _format_where_clauses())
:return:
:rtype: A tuple of two elements: raw SQL WHERE combined clauses and parameters
"""
if where_clauses:
sql_where, params = cls._format_where_clauses(where_clauses, params=params, where_op=where_op)
sql += " WHERE " + sql_where
return (sql, params)
@staticmethod
def _quote_table_name(table):
""" Quote table name """
return '"{0}"'.format(
'"."'.join(
table.split('.')
)
)
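Quoting each dot-separated component keeps schema-qualified names valid as PostgreSQL identifiers; a standalone copy of the helper shows the effect:

```python
def quote_table_name(table):
    # Split on "." so schema-qualified names are quoted part by part
    return '"{0}"'.format('"."'.join(table.split(".")))


print(quote_table_name("users"))         # "users"
print(quote_table_name("public.users"))  # "public"."users"
```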
def insert(self, table, values, just_try=False):
""" Run INSERT SQL query """
sql = 'INSERT INTO {0} ("{1}") VALUES ({2})'.format(
self._quote_table_name(table),
'", "'.join(values.keys()),
", ".join([
self.format_param(key)
for key in values
])
)
if just_try:
log.debug("Just-try mode: execute INSERT query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=values):
log.error("Fail to execute INSERT query (SQL: %s)", sql)
return False
return True
def update(self, table, values, where_clauses, where_op=None, just_try=False):
""" Run UPDATE SQL query """
sql = 'UPDATE {0} SET {1}'.format(
self._quote_table_name(table),
", ".join([
'"{0}" = {1}'.format(key, self.format_param(key))
for key in values
])
)
params = values
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (PgDBDuplicatedSQLParameter, PgDBUnsupportedWHEREClauses):
log.error('Fail to add WHERE clauses', exc_info=True)
return False
if just_try:
log.debug("Just-try mode: execute UPDATE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=params):
log.error("Fail to execute UPDATE query (SQL: %s)", sql)
return False
return True
def delete(self, table, where_clauses, where_op='AND', just_try=False):
""" Run DELETE SQL query """
sql = 'DELETE FROM {0}'.format(self._quote_table_name(table))
params = dict()
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (PgDBDuplicatedSQLParameter, PgDBUnsupportedWHEREClauses):
log.error('Fail to add WHERE clauses', exc_info=True)
return False
if just_try:
log.debug("Just-try mode: execute UPDATE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql, params=params):
log.error("Fail to execute UPDATE query (SQL: %s)", sql)
return False
return True
def truncate(self, table, just_try=False):
""" Run TRUNCATE SQL query """
sql = 'TRUNCATE {0}'.format(self._quote_table_name(table))
if just_try:
log.debug("Just-try mode: execute TRUNCATE query: %s", sql)
return True
log.debug(sql)
if not self.doSQL(sql):
log.error("Fail to execute TRUNCATE query (SQL: %s)", sql)
return False
return True
def select(self, table, where_clauses=None, fields=None, where_op='AND', order_by=None, just_try=False):
""" Run SELECT SQL query """
sql = "SELECT "
if fields is None:
sql += "*"
elif isinstance(fields, str):
sql += '"{0}"'.format(fields)
else:
sql += '"{0}"'.format('", "'.join(fields))
sql += ' FROM {0}'.format(self._quote_table_name(table))
params = dict()
try:
sql, params = self._add_where_clauses(sql, params, where_clauses, where_op=where_op)
except (PgDBDuplicatedSQLParameter, PgDBUnsupportedWHEREClauses):
log.error('Fail to add WHERE clauses', exc_info=True)
return False
if order_by:
if isinstance(order_by, str):
sql += ' ORDER BY {0}'.format(order_by)
elif (
isinstance(order_by, (list, tuple)) and len(order_by) == 2
and isinstance(order_by[0], str)
and isinstance(order_by[1], str)
and order_by[1].upper() in ('ASC', 'DESC')
):
sql += ' ORDER BY "{0}" {1}'.format(order_by[0], order_by[1].upper())
else:
raise PgDBInvalidOrderByClause(order_by)
if just_try:
log.debug("Just-try mode: execute SELECT query : %s", sql)
return just_try
return self.doSelect(sql, params=params)
#
# Deprecated helpers
#
@classmethod
def _quote_value(cls, value):
""" Quote a value for SQL query """
"""Quote a value for SQL query"""
if value is None:
return "NULL"
if isinstance(value, (int, float)):
return str(value)
@@ -454,26 +143,27 @@ class PgDB:
elif isinstance(value, datetime.date):
value = cls._format_date(value)
return "'%s'" % value.replace("'", "''")
# pylint: disable=consider-using-f-string
return "'{}'".format(value.replace("'", "''"))
@classmethod
def _format_datetime(cls, value):
""" Format datetime object as string """
"""Format datetime object as string"""
assert isinstance(value, datetime.datetime)
return value.strftime(cls.datetime_format)
@classmethod
def _format_date(cls, value):
""" Format date object as string """
"""Format date object as string"""
assert isinstance(value, (datetime.date, datetime.datetime))
return value.strftime(cls.date_format)
@classmethod
def time2datetime(cls, time):
""" Convert timestamp to datetime string """
"""Convert timestamp to datetime string"""
return cls._format_datetime(datetime.datetime.fromtimestamp(int(time)))
@classmethod
def time2date(cls, time):
""" Convert timestamp to date string """
"""Convert timestamp to date string"""
return cls._format_date(datetime.date.fromtimestamp(int(time)))
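These timestamp helpers wrap datetime.fromtimestamp() plus strftime(). A standalone sketch follows; the `%Y-%m-%d %H:%M:%S` format string is an assumption about `PgDB.datetime_format`, and UTC is pinned here for a deterministic result, whereas the class methods use the host's local timezone:

```python
import datetime

DATETIME_FORMAT = "%Y-%m-%d %H:%M:%S"  # assumed value of PgDB.datetime_format


def time2datetime(time, tz=datetime.timezone.utc):
    # PgDB.time2datetime() relies on the local timezone; UTC is pinned here
    # so the output does not depend on the host configuration
    return datetime.datetime.fromtimestamp(int(time), tz=tz).strftime(DATETIME_FORMAT)


print(time2datetime(86400))  # 1970-01-02 00:00:00
```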


@@ -1,69 +1,155 @@
# coding: utf8
""" Report """
import atexit
import logging
from mylib.config import ConfigurableObject, StringOption
from mylib.email import EmailClient
log = logging.getLogger(__name__)
class Report(ConfigurableObject): # pylint: disable=useless-object-inheritance
"""Logging report"""
_config_name = "report"
_config_comment = "Email report"
_defaults = {
"recipient": None,
"subject": "Report",
"loglevel": "WARNING",
"logformat": "%(asctime)s - %(levelname)s - %(message)s",
"just_try": False,
}
content = []
handler = None
formatter = None
subject = None
rcpt_to = None
email_client = None
def __init__(
self,
email_client=None,
add_logging_handler=False,
send_at_exit=None,
initialize=True,
**kwargs,
):
super().__init__(**kwargs)
self.email_client = email_client
self.add_logging_handler = add_logging_handler
self._send_at_exit = send_at_exit
self._attachment_files = []
self._attachment_payloads = []
if initialize:
self.initialize()
def configure(self, **kwargs): # pylint: disable=arguments-differ
"""Configure options on registered mylib.Config object"""
section = super().configure(
just_try_help=kwargs.pop("just_try_help", "Just-try mode: do not really send report"),
**kwargs,
)
section.add_option(StringOption, "recipient", comment="Report recipient email address")
section.add_option(
StringOption,
"subject",
default=self._defaults["subject"],
comment="Report email subject",
)
section.add_option(
StringOption,
"loglevel",
default=self._defaults["loglevel"],
comment='Report log level (as accepted by python logging, for instance "INFO")',
)
section.add_option(
StringOption,
"logformat",
default=self._defaults["logformat"],
comment="Report log format (as accepted by python logging)",
)
if not self.email_client:
self.email_client = EmailClient(config=self._config)
self.email_client.configure()
return section
def initialize(self, loaded_config=None):
"""Configuration initialized hook"""
super().initialize(loaded_config=loaded_config)
self.handler = logging.StreamHandler(self)
loglevel = self._get_option("loglevel").upper()
assert hasattr(logging, loglevel), f"Invalid report loglevel {loglevel}"
self.handler.setLevel(getattr(logging, loglevel))
self.formatter = logging.Formatter(self._get_option("logformat"))
self.handler.setFormatter(self.formatter)
if self.add_logging_handler:
logging.getLogger().addHandler(self.handler)
if self._send_at_exit:
self.send_at_exit()
def get_handler(self):
""" Retreive logging handler """
"""Retrieve logging handler"""
return self.handler
def write(self, msg):
""" Write a message """
"""Write a message"""
self.content.append(msg)
def get_content(self):
""" Read the report content """
"""Read the report content"""
return "".join(self.content)
def add_attachment_file(self, filepath):
"""Add attachment file"""
self._attachment_files.append(filepath)
def add_attachment_payload(self, payload):
"""Add attachment payload"""
self._attachment_payloads.append(payload)
def send(self, subject=None, rcpt_to=None, email_client=None, just_try=None):
"""Send report using an EmailClient"""
if rcpt_to is None:
rcpt_to = self._get_option("recipient")
if not rcpt_to:
log.debug("No report recipient, do not send report")
return True
if subject is None:
subject = self._get_option("subject")
assert subject, "You must provide report subject using Report.__init__ or Report.send"
if email_client is None:
email_client = self.email_client
assert email_client, (
"You must provide an email client __init__(), send() or send_at_exit() methods argument"
" email_client"
)
content = self.get_content()
if not content:
log.debug("Report is empty, do not send it")
return True
msg = email_client.forge_message(
rcpt_to,
subject=subject,
text_body=content,
attachment_files=self._attachment_files,
attachment_payloads=self._attachment_payloads,
)
if email_client.send(
rcpt_to, msg=msg, just_try=just_try if just_try is not None else self._just_try
):
log.debug("Report sent to %s", rcpt_to)
return True
log.error("Fail to send report to %s", rcpt_to)
return False
def send_at_exit(self, **kwargs):
""" Send report at exit """
"""Send report at exit"""
atexit.register(self.send, **kwargs)
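The core trick of Report — passing `self` to logging.StreamHandler so that formatted records accumulate in a buffer until send() — works with any object exposing write(). A minimal stand-in, with hypothetical names, shows the mechanism:

```python
import logging


class ReportBuffer:
    """Minimal stand-in for Report: collects formatted log records."""

    def __init__(self):
        self.content = []

    def write(self, msg):
        self.content.append(msg)

    def get_content(self):
        return "".join(self.content)


buffer = ReportBuffer()
handler = logging.StreamHandler(buffer)  # any object with write() works as a stream
handler.setLevel(logging.WARNING)        # records below WARNING are dropped by the handler
handler.setFormatter(logging.Formatter("%(levelname)s - %(message)s"))

demo_log = logging.getLogger("report-demo")
demo_log.setLevel(logging.DEBUG)
demo_log.propagate = False  # keep demo records out of the root logger
demo_log.addHandler(handler)

demo_log.warning("disk almost full")
demo_log.info("filtered out by the handler level")
print(repr(buffer.get_content()))  # 'WARNING - disk almost full\n'
```

Registering get_content()'s consumer with atexit, as send_at_exit() does above, then turns any script's accumulated warnings into a single end-of-run email.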


@@ -0,0 +1 @@
<strong>Just a test email.</strong> <small>(sent at ${sent_date})</small>


@@ -0,0 +1 @@
Test email


@@ -0,0 +1 @@
Just a test email sent at ${sent_date}.


@@ -1,78 +1,102 @@
# -*- coding: utf-8 -*-
""" Test Email client """
import datetime
import getpass
import logging
import os
import sys
from mako.template import Template as MakoTemplate
from mylib.scripts.helpers import add_email_opts, get_opts_parser, init_email_client, init_logging
log = logging.getLogger("mylib.scripts.email_test")
def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
""" Script main """
"""Script main"""
if argv is None:
argv = sys.argv[1:]
# Options parser
parser = get_opts_parser(just_try=True)
add_email_opts(
parser,
templates_path=os.path.join(os.path.dirname(os.path.realpath(__file__)), "email_templates"),
)
test_opts = parser.add_argument_group("Test email options")
test_opts.add_argument(
"-t",
"--to",
action="store",
type=str,
dest="test_to",
help="Test email recipient",
help="Test email recipient(s)",
nargs="+",
)
test_opts.add_argument(
"-T",
"--template",
action="store_true",
dest="template",
help="Template name to send (default: test)",
default="test",
)
test_opts.add_argument(
"-m",
"--mako",
action="store_true",
dest="test_mako",
help="Test mako templating",
)
test_opts.add_argument(
"--cc",
action="store",
type=str,
dest="test_cc",
help="Test CC email recipient(s)",
nargs="+",
)
test_opts.add_argument(
"--bcc",
action="store",
type=str,
dest="test_bcc",
help="Test BCC email recipient(s)",
nargs="+",
)
options = parser.parse_args()
if not options.test_to:
parser.error("You must specify at least one test email recipient using -t/--to parameter")
sys.exit(1)
# Initialize logs
init_logging(options, "Test EmailClient")
if options.email_smtp_user and not options.email_smtp_password:
options.email_smtp_password = getpass.getpass("Please enter SMTP password: ")
email_client = init_email_client(options)
log.info(
"Send a test email to %s (CC: %s / BCC: %s)",
", ".join(options.test_to),
", ".join(options.test_cc) if options.test_cc else None,
", ".join(options.test_bcc) if options.test_bcc else None,
)
if email_client.send(
options.test_to,
cc=options.test_cc,
bcc=options.test_bcc,
template="test",
sent_date=datetime.datetime.now(),
):
log.info("Test email sent")
sys.exit(0)
log.error("Fail to send test email")
sys.exit(1)


@@ -1,72 +1,93 @@
# -*- coding: utf-8 -*-
""" Test Email client using mylib.config.Config for configuration """
import datetime
import logging
import os
import sys
from mako.template import Template as MakoTemplate
from mylib.config import Config
from mylib.email import EmailClient
log = logging.getLogger(__name__)
def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
""" Script main """
"""Script main"""
if argv is None:
argv = sys.argv[1:]
config = Config(__doc__, __name__.replace(".", "_"))
email_client = EmailClient(config=config)
email_client.set_default(
"templates_path",
os.path.join(os.path.dirname(os.path.realpath(__file__)), "email_templates"),
)
email_client.configure(just_try=True)
# Options parser
parser = config.get_arguments_parser(description=__doc__)
test_opts = parser.add_argument_group("Test email options")
test_opts.add_argument(
"-t",
"--to",
action="store",
type=str,
dest="test_to",
help="Test email recipient",
help="Test email recipient(s)",
nargs="+",
)
test_opts.add_argument(
"-T",
"--template",
action="store_true",
dest="template",
help="Template name to send (default: test)",
default="test",
)
test_opts.add_argument(
"-m",
"--mako",
action="store_true",
dest="test_mako",
help="Test mako templating",
)
test_opts.add_argument(
"--cc",
action="store",
type=str,
dest="test_cc",
help="Test CC email recipient(s)",
nargs="+",
)
test_opts.add_argument(
"--bcc",
action="store",
type=str,
dest="test_bcc",
help="Test BCC email recipient(s)",
nargs="+",
)
options = config.parse_arguments_options()
if not options.test_to:
parser.error("You must specify at least one test email recipient using -t/--to parameter")
sys.exit(1)
if email_client.send(
options.test_to,
cc=options.test_cc,
bcc=options.test_bcc,
template="test",
sent_date=datetime.datetime.now(),
):
logging.info("Test email sent")
sys.exit(0)
logging.error("Fail to send test email")
sys.exit(1)


@@ -1,10 +1,9 @@
# coding: utf8
""" Scripts helpers """
import argparse
import getpass
import logging
import os.path
import socket
import sys
@@ -12,8 +11,8 @@ log = logging.getLogger(__name__)
def init_logging(options, name, report=None):
""" Initialize logging from calling script options """
logformat = '%(asctime)s - ' + name + ' - %(levelname)s - %(message)s'
"""Initialize logging from calling script options"""
logformat = f"%(asctime)s - {name} - %(levelname)s - %(message)s"
if options.debug:
loglevel = logging.DEBUG
elif options.verbose:
@@ -32,199 +31,270 @@ def init_logging(options, name, report=None):
def get_default_opt_value(config, default_config, key):
""" Retreive default option value from config or default config dictionaries """
"""Retrieve default option value from config or default config dictionaries"""
if config and key in config:
return config[key]
return default_config.get(key)
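The lookup precedence implemented here (an explicitly loaded config value wins over the hard-coded default) can be checked in isolation; the dictionaries below are illustrative:

```python
def get_default_opt_value(config, default_config, key):
    # A loaded config value wins; otherwise fall back to the hard-coded default
    if config and key in config:
        return config[key]
    return default_config.get(key)


defaults = {"logfile": None, "smtp_port": 25}
print(get_default_opt_value({"smtp_port": 587}, defaults, "smtp_port"))  # 587
print(get_default_opt_value(None, defaults, "smtp_port"))                # 25
print(get_default_opt_value(None, defaults, "unknown"))                  # None
```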
def get_opts_parser(
desc=None, just_try=False, just_one=False, progress=False, config=None, **kwargs
):
"""Retrieve options parser"""
default_config = {"logfile": None}
parser = argparse.ArgumentParser(description=desc, **kwargs)
parser.add_argument(
"-v", "--verbose", action="store_true", dest="verbose", help="Enable verbose mode"
)
parser.add_argument(
"-d", "--debug", action="store_true", dest="debug", help="Enable debug mode"
)
parser.add_argument(
"-l",
"--log-file",
action="store",
type=str,
dest="logfile",
help="Log file path (default: %s)" % get_default_opt_value(config, default_config, 'logfile'),
default=get_default_opt_value(config, default_config, 'logfile')
help=f'Log file path (default: {get_default_opt_value(config, default_config, "logfile")})',
default=get_default_opt_value(config, default_config, "logfile"),
)
parser.add_argument(
"-C",
"--console",
action="store_true",
dest="console",
help="Always log on console (even if log file is configured)"
help="Always log on console (even if log file is configured)",
)
if just_try:
parser.add_argument(
"-j", "--just-try", action="store_true", dest="just_try", help="Enable just-try mode"
)
if just_one:
parser.add_argument(
"-J", "--just-one", action="store_true", dest="just_one", help="Enable just-one mode"
)
if progress:
parser.add_argument(
"-p", "--progress", action="store_true", dest="progress", help="Enable progress bar"
)
return parser
def add_email_opts(parser, config=None, **defaults):
"""Add email options"""
email_opts = parser.add_argument_group("Email options")
default_config = {
"smtp_host": "127.0.0.1",
"smtp_port": 25,
"smtp_ssl": False,
"smtp_tls": False,
"smtp_user": None,
"smtp_password": None,
"smtp_debug": False,
"email_encoding": sys.getdefaultencoding(),
"sender_name": getpass.getuser(),
"sender_email": f"{getpass.getuser()}@{socket.gethostname()}",
"catch_all": False,
"templates_path": None,
}
default_config.update(defaults)
email_opts.add_argument(
"--smtp-host",
action="store",
type=str,
dest="email_smtp_host",
help="SMTP host (default: %s)" % get_default_opt_value(config, default_config, 'smtp_host'),
default=get_default_opt_value(config, default_config, 'smtp_host')
help=f'SMTP host (default: {get_default_opt_value(config, default_config, "smtp_host")})',
default=get_default_opt_value(config, default_config, "smtp_host"),
)
email_opts.add_argument(
"--smtp-port",
action="store",
type=int,
dest="email_smtp_port",
help="SMTP port (default: %s)" % get_default_opt_value(config, default_config, 'smtp_port'),
default=get_default_opt_value(config, default_config, 'smtp_port')
help=f'SMTP port (default: {get_default_opt_value(config, default_config, "smtp_port")})',
default=get_default_opt_value(config, default_config, "smtp_port"),
)
email_opts.add_argument(
"--smtp-ssl",
action="store_true",
dest="email_smtp_ssl",
help="Use SSL (default: %s)" % get_default_opt_value(config, default_config, 'smtp_ssl'),
default=get_default_opt_value(config, default_config, 'smtp_ssl')
help=f'Use SSL (default: {get_default_opt_value(config, default_config, "smtp_ssl")})',
default=get_default_opt_value(config, default_config, "smtp_ssl"),
)
email_opts.add_argument(
"--smtp-tls",
action="store_true",
dest="email_smtp_tls",
help="Use TLS (default: %s)" % get_default_opt_value(config, default_config, 'smtp_tls'),
default=get_default_opt_value(config, default_config, 'smtp_tls')
help=f'Use TLS (default: {get_default_opt_value(config, default_config, "smtp_tls")})',
default=get_default_opt_value(config, default_config, "smtp_tls"),
)
email_opts.add_argument(
"--smtp-user",
action="store",
type=str,
dest="email_smtp_user",
help="SMTP username (default: %s)" % get_default_opt_value(config, default_config, 'smtp_user'),
default=get_default_opt_value(config, default_config, 'smtp_user')
help=(
f'SMTP username (default: {get_default_opt_value(config, default_config, "smtp_user")})'
),
default=get_default_opt_value(config, default_config, "smtp_user"),
)
email_opts.add_argument(
"--smtp-password",
action="store",
type=str,
dest="email_smtp_password",
help="SMTP password (default: %s)" % get_default_opt_value(config, default_config, 'smtp_password'),
default=get_default_opt_value(config, default_config, 'smtp_password')
help=(
"SMTP password (default:"
f' {get_default_opt_value(config, default_config, "smtp_password")})'
),
default=get_default_opt_value(config, default_config, "smtp_password"),
)
email_opts.add_argument(
"--smtp-debug",
action="store_true",
dest="email_smtp_debug",
help="Debug SMTP connection (default: %s)" % get_default_opt_value(config, default_config, 'smtp_debug'),
default=get_default_opt_value(config, default_config, 'smtp_debug')
help=(
"Debug SMTP connection (default:"
f' {get_default_opt_value(config, default_config, "smtp_debug")})'
),
default=get_default_opt_value(config, default_config, "smtp_debug"),
)
email_opts.add_argument(
"--email-encoding",
action="store",
type=str,
dest="email_encoding",
help="SMTP encoding (default: %s)" % get_default_opt_value(config, default_config, 'email_encoding'),
default=get_default_opt_value(config, default_config, 'email_encoding')
help=(
"SMTP encoding (default:"
f' {get_default_opt_value(config, default_config, "email_encoding")})'
),
default=get_default_opt_value(config, default_config, "email_encoding"),
)
email_opts.add_argument(
"--sender-name",
action="store",
type=str,
dest="email_sender_name",
help="Sender name (default: %s)" % get_default_opt_value(config, default_config, 'sender_name'),
default=get_default_opt_value(config, default_config, 'sender_name')
help=(
f'Sender name (default: {get_default_opt_value(config, default_config, "sender_name")})'
),
default=get_default_opt_value(config, default_config, "sender_name"),
)
email_opts.add_argument(
"--sender-email",
action="store",
type=str,
dest="email_sender_email",
help="Sender email (default: %s)" % get_default_opt_value(config, default_config, 'sender_email'),
default=get_default_opt_value(config, default_config, 'sender_email')
help=(
"Sender email (default:"
f' {get_default_opt_value(config, default_config, "sender_email")})'
),
default=get_default_opt_value(config, default_config, "sender_email"),
)
email_opts.add_argument(
"--catch-all",
action="store",
type=str,
dest="email_catch_all",
help="Catch all sent email: specify catch recipient email address (default: %s)" % get_default_opt_value(config, default_config, 'catch_all'),
default=get_default_opt_value(config, default_config, 'catch_all')
help=(
"Catch all sent email: specify catch recipient email address "
f'(default: {get_default_opt_value(config, default_config, "catch_all")})'
),
default=get_default_opt_value(config, default_config, "catch_all"),
)
email_opts.add_argument(
"--templates-path",
action="store",
type=str,
dest="email_templates_path",
help=(
"Load templates from specify directory "
f'(default: {get_default_opt_value(config, default_config, "templates_path")})'
),
default=get_default_opt_value(config, default_config, "templates_path"),
)
def init_email_client(options, **kwargs):
""" Initialize email client from calling script options """
"""Initialize email client from calling script options"""
from mylib.email import EmailClient # pylint: disable=import-outside-toplevel
log.info("Initialize Email client")
return EmailClient(options=options, initialize=True, **kwargs)
def add_sftp_opts(parser):
"""Add SFTP options to argpase.ArgumentParser"""
sftp_opts = parser.add_argument_group("SFTP options")
sftp_opts.add_argument(
"-H",
"--sftp-host",
action="store",
type=str,
dest="sftp_host",
help="SFTP Host (default: localhost)",
default="localhost",
)
sftp_opts.add_argument(
"--sftp-port",
action="store",
type=int,
dest="sftp_port",
help="SFTP Port (default: 22)",
default=22,
)
sftp_opts.add_argument(
"-u", "--sftp-user", action="store", type=str, dest="sftp_user", help="SFTP User"
)
sftp_opts.add_argument(
"-P",
"--sftp-password",
action="store",
type=str,
dest="sftp_password",
help="SFTP Password",
)
sftp_opts.add_argument(
"--sftp-known-hosts",
action="store",
type=str,
dest="sftp_known_hosts",
help="SFTP known_hosts file path (default: ~/.ssh/known_hosts)",
default=os.path.expanduser("~/.ssh/known_hosts"),
)
sftp_opts.add_argument(
"--sftp-auto-add-unknown-host-key",
action="store_true",
dest="sftp_auto_add_unknown_host_key",
help="Auto-add unknown SSH host key",
)
return sftp_opts
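How the option group above parses can be shown with a hypothetical minimal reconstruction of add_sftp_opts(); only the host and port arguments are reproduced here:

```python
import argparse

parser = argparse.ArgumentParser()
sftp_opts = parser.add_argument_group("SFTP options")
sftp_opts.add_argument("-H", "--sftp-host", dest="sftp_host", default="localhost")
sftp_opts.add_argument("--sftp-port", type=int, dest="sftp_port", default=22)

# Short and long forms both land on the dest attributes
options = parser.parse_args(["-H", "sftp.example.com", "--sftp-port", "2222"])
print(options.sftp_host, options.sftp_port)  # sftp.example.com 2222
```

The argument group only affects --help output grouping; parsing and the resulting namespace are identical to adding the arguments directly on the parser.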


@@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
""" Test LDAP """
import datetime
import logging
@@ -8,16 +6,14 @@ import sys
import dateutil.tz
import pytz
from mylib.ldap import format_date, format_datetime, parse_date, parse_datetime
from mylib.scripts.helpers import get_opts_parser, init_logging
log = logging.getLogger("mylib.scripts.ldap_test")
def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
""" Script main """
"""Script main"""
if argv is None:
argv = sys.argv[1:]
@@ -26,52 +22,121 @@ def main(argv=None):  # pylint: disable=too-many-locals,too-many-statements
options = parser.parse_args()
# Initialize logs
init_logging(options, "Test LDAP helpers")
now = datetime.datetime.now().replace(tzinfo=dateutil.tz.tzlocal())
print("Now = %s" % now)
print(f"Now = {now}")
datestring_now = format_datetime(now)
print("format_datetime : %s" % datestring_now)
print("format_datetime (from_timezone=utc) : %s" % format_datetime(now.replace(tzinfo=None), from_timezone=pytz.utc))
print("format_datetime (from_timezone=local) : %s" % format_datetime(now.replace(tzinfo=None), from_timezone=dateutil.tz.tzlocal()))
print("format_datetime (from_timezone='local') : %s" % format_datetime(now.replace(tzinfo=None), from_timezone='local'))
print("format_datetime (from_timezone=Paris) : %s" % format_datetime(now.replace(tzinfo=None), from_timezone='Europe/Paris'))
print("format_datetime (to_timezone=utc) : %s" % format_datetime(now, to_timezone=pytz.utc))
print("format_datetime (to_timezone=local) : %s" % format_datetime(now, to_timezone=dateutil.tz.tzlocal()))
print("format_datetime (to_timezone='local') : %s" % format_datetime(now, to_timezone='local'))
print("format_datetime (to_timezone=Tokyo) : %s" % format_datetime(now, to_timezone='Asia/Tokyo'))
print("format_datetime (naive=True) : %s" % format_datetime(now, naive=True))
print(f"format_datetime : {datestring_now}")
print(
"format_datetime (from_timezone=utc) :"
f" {format_datetime(now.replace(tzinfo=None), from_timezone=pytz.utc)}"
)
print(
"format_datetime (from_timezone=local) :"
f" {format_datetime(now.replace(tzinfo=None), from_timezone=dateutil.tz.tzlocal())}"
)
print(
"format_datetime (from_timezone=local) :"
f' {format_datetime(now.replace(tzinfo=None), from_timezone="local")}'
)
print(
"format_datetime (from_timezone=Paris) :"
f' {format_datetime(now.replace(tzinfo=None), from_timezone="Europe/Paris")}'
)
print(f"format_datetime (to_timezone=utc) : {format_datetime(now, to_timezone=pytz.utc)}")
print(
"format_datetime (to_timezone=local) :"
f" {format_datetime(now, to_timezone=dateutil.tz.tzlocal())}"
)
print(f'format_datetime (to_timezone=local) : {format_datetime(now, to_timezone="local")}')
print(f'format_datetime (to_timezone=Tokyo) : {format_datetime(now, to_timezone="Asia/Tokyo")}')
print(f"format_datetime (naive=True) : {format_datetime(now, naive=True)}")
print("format_date : %s" % format_date(now))
print("format_date (from_timezone=utc) : %s" % format_date(now.replace(tzinfo=None), from_timezone=pytz.utc))
print("format_date (from_timezone=local) : %s" % format_date(now.replace(tzinfo=None), from_timezone=dateutil.tz.tzlocal()))
print("format_date (from_timezone='local') : %s" % format_date(now.replace(tzinfo=None), from_timezone='local'))
print("format_date (from_timezone=Paris) : %s" % format_date(now.replace(tzinfo=None), from_timezone='Europe/Paris'))
print("format_date (to_timezone=utc) : %s" % format_date(now, to_timezone=pytz.utc))
print("format_date (to_timezone=local) : %s" % format_date(now, to_timezone=dateutil.tz.tzlocal()))
print("format_date (to_timezone='local') : %s" % format_date(now, to_timezone='local'))
print("format_date (to_timezone=Tokyo) : %s" % format_date(now, to_timezone='Asia/Tokyo'))
print("format_date (naive=True) : %s" % format_date(now, naive=True))
print(f"format_date : {format_date(now)}")
print(
"format_date (from_timezone=utc) :"
f" {format_date(now.replace(tzinfo=None), from_timezone=pytz.utc)}"
)
print(
"format_date (from_timezone=local) :"
f" {format_date(now.replace(tzinfo=None), from_timezone=dateutil.tz.tzlocal())}"
)
print(
"format_date (from_timezone=local) :"
f' {format_date(now.replace(tzinfo=None), from_timezone="local")}'
)
print(
"format_date (from_timezone=Paris) :"
f' {format_date(now.replace(tzinfo=None), from_timezone="Europe/Paris")}'
)
print(f"format_date (to_timezone=utc) : {format_date(now, to_timezone=pytz.utc)}")
print(
f"format_date (to_timezone=local) : {format_date(now, to_timezone=dateutil.tz.tzlocal())}"
)
print(f'format_date (to_timezone=local) : {format_date(now, to_timezone="local")}')
print(f'format_date (to_timezone=Tokyo) : {format_date(now, to_timezone="Asia/Tokyo")}')
print(f"format_date (naive=True) : {format_date(now, naive=True)}")
print("parse_datetime : %s" % parse_datetime(datestring_now))
print("parse_datetime (default_timezone=utc) : %s" % parse_datetime(datestring_now[0:-1], default_timezone=pytz.utc))
print("parse_datetime (default_timezone=local) : %s" % parse_datetime(datestring_now[0:-1], default_timezone=dateutil.tz.tzlocal()))
print("parse_datetime (default_timezone='local') : %s" % parse_datetime(datestring_now[0:-1], default_timezone='local'))
print("parse_datetime (default_timezone=Paris) : %s" % parse_datetime(datestring_now[0:-1], default_timezone='Europe/Paris'))
print("parse_datetime (to_timezone=utc) : %s" % parse_datetime(datestring_now, to_timezone=pytz.utc))
print("parse_datetime (to_timezone=local) : %s" % parse_datetime(datestring_now, to_timezone=dateutil.tz.tzlocal()))
print("parse_datetime (to_timezone='local') : %s" % parse_datetime(datestring_now, to_timezone='local'))
print("parse_datetime (to_timezone=Tokyo) : %s" % parse_datetime(datestring_now, to_timezone='Asia/Tokyo'))
print("parse_datetime (naive=True) : %s" % parse_datetime(datestring_now, naive=True))
print(f"parse_datetime : {parse_datetime(datestring_now)}")
print(
"parse_datetime (default_timezone=utc) :"
f" {parse_datetime(datestring_now[0:-1], default_timezone=pytz.utc)}"
)
print(
"parse_datetime (default_timezone=local) :"
f" {parse_datetime(datestring_now[0:-1], default_timezone=dateutil.tz.tzlocal())}"
)
print(
"parse_datetime (default_timezone=local) :"
f' {parse_datetime(datestring_now[0:-1], default_timezone="local")}'
)
print(
"parse_datetime (default_timezone=Paris) :"
f' {parse_datetime(datestring_now[0:-1], default_timezone="Europe/Paris")}'
)
print(
f"parse_datetime (to_timezone=utc) : {parse_datetime(datestring_now, to_timezone=pytz.utc)}"
)
print(
"parse_datetime (to_timezone=local) :"
f" {parse_datetime(datestring_now, to_timezone=dateutil.tz.tzlocal())}"
)
print(
"parse_datetime (to_timezone=local) :"
f' {parse_datetime(datestring_now, to_timezone="local")}'
)
print(
"parse_datetime (to_timezone=Tokyo) :"
f' {parse_datetime(datestring_now, to_timezone="Asia/Tokyo")}'
)
print(f"parse_datetime (naive=True) : {parse_datetime(datestring_now, naive=True)}")
print("parse_date : %s" % parse_date(datestring_now))
print("parse_date (default_timezone=utc) : %s" % parse_date(datestring_now[0:-1], default_timezone=pytz.utc))
print("parse_date (default_timezone=local) : %s" % parse_date(datestring_now[0:-1], default_timezone=dateutil.tz.tzlocal()))
print("parse_date (default_timezone='local') : %s" % parse_date(datestring_now[0:-1], default_timezone='local'))
print("parse_date (default_timezone=Paris) : %s" % parse_date(datestring_now[0:-1], default_timezone='Europe/Paris'))
print("parse_date (to_timezone=utc) : %s" % parse_date(datestring_now, to_timezone=pytz.utc))
print("parse_date (to_timezone=local) : %s" % parse_date(datestring_now, to_timezone=dateutil.tz.tzlocal()))
print("parse_date (to_timezone='local') : %s" % parse_date(datestring_now, to_timezone='local'))
print("parse_date (to_timezone=Tokyo) : %s" % parse_date(datestring_now, to_timezone='Asia/Tokyo'))
print("parse_date (naive=True) : %s" % parse_date(datestring_now, naive=True))
print(f"parse_date : {parse_date(datestring_now)}")
print(
"parse_date (default_timezone=utc) :"
f" {parse_date(datestring_now[0:-1], default_timezone=pytz.utc)}"
)
print(
"parse_date (default_timezone=local) :"
f" {parse_date(datestring_now[0:-1], default_timezone=dateutil.tz.tzlocal())}"
)
print(
"parse_date (default_timezone=local) :"
f' {parse_date(datestring_now[0:-1], default_timezone="local")}'
)
print(
"parse_date (default_timezone=Paris) :"
f' {parse_date(datestring_now[0:-1], default_timezone="Europe/Paris")}'
)
print(f"parse_date (to_timezone=utc) : {parse_date(datestring_now, to_timezone=pytz.utc)}")
print(
"parse_date (to_timezone=local) :"
f" {parse_date(datestring_now, to_timezone=dateutil.tz.tzlocal())}"
)
print(f'parse_date (to_timezone=local) : {parse_date(datestring_now, to_timezone="local")}')
print(
f'parse_date (to_timezone=Tokyo) : {parse_date(datestring_now, to_timezone="Asia/Tokyo")}'
)
print(f"parse_date (naive=True) : {parse_date(datestring_now, naive=True)}")
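The script above exercises `format_datetime`/`parse_datetime` round-trips; the `datestring_now[0:-1]` calls suggest the formatted value carries a trailing timezone character. As a stdlib-only illustration (the generalized-time shape `YYYYMMDDHHMMSSZ` in UTC is an assumption here; the real `mylib.ldap` helpers may differ):

```python
import datetime

LDAP_TIME_FORMAT = "%Y%m%d%H%M%SZ"  # assumed LDAP generalized-time shape, UTC

def to_ldap(dt):
    # Convert an aware datetime to UTC, then render as generalized time
    return dt.astimezone(datetime.timezone.utc).strftime(LDAP_TIME_FORMAT)

def from_ldap(value):
    # Parse a generalized-time string back into an aware UTC datetime
    return datetime.datetime.strptime(value, LDAP_TIME_FORMAT).replace(
        tzinfo=datetime.timezone.utc
    )

dt = datetime.datetime(2024, 4, 16, 10, 39, 55, tzinfo=datetime.timezone.utc)
s = to_ldap(dt)
print(s)                    # 20240416103955Z
print(from_ldap(s) == dt)   # True
```

Stripping the trailing character, as the test script does with `datestring_now[0:-1]`, yields a naive timestamp, which is why the parse helpers accept a `default_timezone`.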

mylib/scripts/map_test.py Normal file

@ -0,0 +1,69 @@
""" Test mapping """
import logging
import sys
from mylib import pretty_format_value
from mylib.mapping import map_hash
from mylib.scripts.helpers import get_opts_parser, init_logging
log = logging.getLogger(__name__)
def main(argv=None):
"""Script main"""
if argv is None:
argv = sys.argv[1:]
# Options parser
parser = get_opts_parser(progress=True)
options = parser.parse_args()
# Initialize logs
init_logging(options, "Test mapping")
src = {
"uid": "hmartin",
"firstname": "Henri",
"lastname": "Martin",
"disp_name": "Henri Martin",
"line_1": "3 rue de Paris",
"line_2": "Pour Pierre",
"zip_text": "92 120",
"city_text": "Montrouge",
"line_city": "92120 Montrouge",
"tel1": "01 00 00 00 00",
"tel2": "09 00 00 00 00",
"mobile": "06 00 00 00 00",
"fax": "01 00 00 00 00",
"email": "H.MARTIN@GMAIL.COM",
}
map_c = {
"uid": {"order": 0, "key": "uid", "required": True},
"givenName": {"order": 1, "key": "firstname"},
"sn": {"order": 2, "key": "lastname"},
"cn": {
"order": 3,
"key": "disp_name",
"required": True,
"or": {"attrs": ["firstname", "lastname"], "join": " "},
},
"displayName": {"order": 4, "other_key": "displayName"},
"street": {"order": 5, "join": " / ", "keys": ["line_1", "line_2"]},
"postalCode": {"order": 6, "key": "zip_text", "cleanRegex": "[^0-9]"},
"l": {"order": 7, "key": "city_text"},
"postalAddress": {"order": 8, "join": "$", "keys": ["line_1", "line_2", "line_city"]},
"telephoneNumber": {
"order": 9,
"keys": ["tel1", "tel2"],
"cleanRegex": "[^0-9+]",
"deduplicate": True,
},
"mobile": {"order": 10, "key": "mobile"},
"facsimileTelephoneNumber": {"order": 11, "key": "fax"},
"mail": {"order": 12, "key": "email", "convert": lambda x: x.lower().strip()},
}
print("Mapping source:\n" + pretty_format_value(src))
print("Mapping config:\n" + pretty_format_value(map_c))
print("Mapping result:\n" + pretty_format_value(map_hash(map_c, src)))
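The mapping configuration above drives `map_hash()`. As a rough sketch of the semantics the config keys imply (`key`, `keys` + `join`, `cleanRegex`, `convert`, `deduplicate`, `order` — names taken from the script; this is a hypothetical simplification, not the real `mylib.mapping.map_hash`):

```python
import re

def map_hash_sketch(mapping, src):
    """Simplified sketch of a map_hash()-style transform.

    Handles only the config keys used in the test script above;
    the real mylib.mapping.map_hash may behave differently.
    """
    result = {}
    # Apply mapping entries in their declared order
    for attr, conf in sorted(mapping.items(), key=lambda kv: kv[1].get("order", 0)):
        keys = conf.get("keys") or ([conf["key"]] if "key" in conf else [])
        values = [src[k] for k in keys if src.get(k)]
        if conf.get("cleanRegex"):
            # Strip characters matching the clean regex
            values = [re.sub(conf["cleanRegex"], "", v) for v in values]
        if conf.get("convert"):
            values = [conf["convert"](v) for v in values]
        if conf.get("deduplicate"):
            values = list(dict.fromkeys(values))  # preserve order, drop duplicates
        if values:
            result[attr] = conf["join"].join(values) if "join" in conf else values[0]
    return result

src = {"zip_text": "92 120", "email": "H.MARTIN@GMAIL.COM", "tel1": "01 00", "tel2": "01 00"}
mapping = {
    "postalCode": {"order": 0, "key": "zip_text", "cleanRegex": "[^0-9]"},
    "mail": {"order": 1, "key": "email", "convert": lambda x: x.lower().strip()},
    "telephoneNumber": {"order": 2, "keys": ["tel1", "tel2"], "deduplicate": True, "join": " / "},
}
print(map_hash_sketch(mapping, src))
# {'postalCode': '92120', 'mail': 'h.martin@gmail.com', 'telephoneNumber': '01 00'}
```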


@ -1,20 +1,16 @@
# -*- coding: utf-8 -*-
""" Test Progress bar """
import logging
import time
import sys
import time
from mylib.pbar import Pbar
from mylib.scripts.helpers import get_opts_parser
from mylib.scripts.helpers import init_logging
from mylib.scripts.helpers import get_opts_parser, init_logging
log = logging.getLogger('mylib.scripts.pbar_test')
log = logging.getLogger("mylib.scripts.pbar_test")
def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
""" Script main """
"""Script main"""
if argv is None:
argv = sys.argv[1:]
@ -23,20 +19,21 @@ def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
parser = get_opts_parser(progress=True)
parser.add_argument(
'-c', '--count',
"-c",
"--count",
action="store",
type=int,
dest="count",
help="Progress bar max value (default: %s)" % default_max_val,
default=default_max_val
help=f"Progress bar max value (default: {default_max_val})",
default=default_max_val,
)
options = parser.parse_args()
# Initialize logs
init_logging(options, 'Test Pbar')
init_logging(options, "Test Pbar")
pbar = Pbar('Test', options.count, enabled=options.progress)
pbar = Pbar("Test", options.count, enabled=options.progress)
for idx in range(0, options.count): # pylint: disable=unused-variable
pbar.increment()


@ -1,19 +1,15 @@
# -*- coding: utf-8 -*-
""" Test report """
import logging
import sys
from mylib.report import Report
from mylib.scripts.helpers import get_opts_parser, add_email_opts
from mylib.scripts.helpers import init_logging, init_email_client
from mylib.scripts.helpers import add_email_opts, get_opts_parser, init_email_client, init_logging
log = logging.getLogger('mylib.scripts.report_test')
log = logging.getLogger("mylib.scripts.report_test")
def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
""" Script main """
"""Script main"""
if argv is None:
argv = sys.argv[1:]
@ -21,29 +17,30 @@ def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
parser = get_opts_parser(just_try=True)
add_email_opts(parser)
report_opts = parser.add_argument_group('Report options')
report_opts = parser.add_argument_group("Report options")
report_opts.add_argument(
'-t', '--to',
"-t",
"--to",
action="store",
type=str,
dest="report_rcpt",
help="Send report to this email"
dest="report_recipient",
help="Send report to this email",
)
options = parser.parse_args()
if not options.report_rcpt:
if not options.report_recipient:
parser.error("You must specify a report recipient using -t/--to parameter")
# Initialize logs
report = Report(rcpt_to=options.report_rcpt, subject='Test report')
init_logging(options, 'Test Report', report=report)
report = Report(options=options, subject="Test report")
init_logging(options, "Test Report", report=report)
email_client = init_email_client(options)
report.send_at_exit(email_client=email_client)
logging.debug('Test debug message')
logging.info('Test info message')
logging.warning('Test warning message')
logging.error('Test error message')
logging.debug("Test debug message")
logging.info("Test info message")
logging.warning("Test warning message")
logging.error("Test error message")

mylib/scripts/sftp_test.py Normal file

@ -0,0 +1,106 @@
""" Test SFTP client """
import atexit
import getpass
import logging
import os
import random
import string
import sys
import tempfile
from mylib.scripts.helpers import add_sftp_opts, get_opts_parser, init_logging
from mylib.sftp import SFTPClient
log = logging.getLogger("mylib.scripts.sftp_test")
def main(argv=None): # pylint: disable=too-many-locals,too-many-statements
"""Script main"""
if argv is None:
argv = sys.argv[1:]
# Options parser
parser = get_opts_parser(just_try=True)
add_sftp_opts(parser)
test_opts = parser.add_argument_group("Test SFTP options")
test_opts.add_argument(
"-p",
"--remote-upload-path",
action="store",
type=str,
dest="upload_path",
help="Remote upload path (default: remote initial connection directory)",
)
options = parser.parse_args()
# Initialize logs
init_logging(options, "Test SFTP client")
if options.sftp_user and not options.sftp_password:
options.sftp_password = getpass.getpass("Please enter SFTP password: ")
log.info("Initialize SFTP client")
sftp = SFTPClient(options=options)
sftp.connect()
atexit.register(sftp.close)
log.debug("Create temporary file")
test_content = b"Just a test."
tmp_dir = tempfile.TemporaryDirectory() # pylint: disable=consider-using-with
tmp_file = os.path.join(
tmp_dir.name,
f'tmp{"".join(random.choice(string.ascii_lowercase) for i in range(8))}', # nosec
)
log.debug('Temporary file path: "%s"', tmp_file)
with open(tmp_file, "wb") as file_desc:
file_desc.write(test_content)
log.debug(
"Upload file %s to SFTP server (in %s)",
tmp_file,
options.upload_path if options.upload_path else "remote initial connection directory",
)
if not sftp.upload_file(tmp_file, options.upload_path):
log.error("Failed to upload test file to SFTP server")
sys.exit(1)
log.info("Test file uploaded on SFTP server")
remote_filepath = (
os.path.join(options.upload_path, os.path.basename(tmp_file))
if options.upload_path
else os.path.basename(tmp_file)
)
if not sftp._just_try: # pylint: disable=protected-access
with tempfile.NamedTemporaryFile() as tmp_file2:
log.info("Retrieve test file to %s", tmp_file2.name)
if not sftp.get_file(remote_filepath, tmp_file2.name):
log.error("Failed to retrieve test file")
else:
with open(tmp_file2.name, "rb") as file_desc:
content = file_desc.read()
log.debug("Read content: %s", content)
if test_content == content:
log.info("Retrieved file content matches the uploaded one")
else:
log.error("Retrieved file content does not match the uploaded one")
try:
log.info("Remotely open test file %s", remote_filepath)
file_desc = sftp.open_file(remote_filepath)
content = file_desc.read()
log.debug("Read content: %s", content)
if test_content == content:
log.info("Content of remote file matches the uploaded one")
else:
log.error("Content of remote file does not match the uploaded one")
except Exception:  # pylint: disable=broad-except
log.exception("An exception occurred remotely opening test file %s", remote_filepath)
if sftp.remove_file(remote_filepath):
log.info("Test file removed from SFTP server")
else:
log.error("Failed to remove test file from SFTP server")


@ -0,0 +1,12 @@
""" Test telltale file """
import logging
from mylib.scripts.telltale_test import default_filepath
from mylib.telltale import TelltaleFile
log = logging.getLogger(__name__)
def main(argv=None):
"""Script main"""
TelltaleFile.check_entrypoint(argv=argv, default_filepath=default_filepath)


@ -0,0 +1,40 @@
""" Test telltale file """
import logging
import os.path
import sys
import tempfile
from mylib.scripts.helpers import get_opts_parser, init_logging
from mylib.telltale import TelltaleFile
log = logging.getLogger(__name__)
default_filepath = os.path.join(tempfile.gettempdir(), f"{__name__}.last")
def main(argv=None):
"""Script main"""
if argv is None:
argv = sys.argv[1:]
# Options parser
parser = get_opts_parser()
parser.add_argument(
"-p",
"--telltale-file-path",
action="store",
type=str,
dest="telltale_file_path",
help=f"Telltale file path (default: {default_filepath})",
default=default_filepath,
)
options = parser.parse_args()
# Initialize logs
init_logging(options, __doc__)
telltale_file = TelltaleFile(filepath=options.telltale_file_path)
telltale_file.update()

mylib/sftp.py Normal file

@ -0,0 +1,162 @@
""" SFTP client """
import logging
import os
from paramiko import AutoAddPolicy, SFTPAttributes, SSHClient
from mylib.config import (
BooleanOption,
ConfigurableObject,
IntegerOption,
PasswordOption,
StringOption,
)
log = logging.getLogger(__name__)
class SFTPClient(ConfigurableObject):
"""
SFTP client
This class abstract all interactions with the SFTP server.
"""
_config_name = "sftp"
_config_comment = "SFTP"
_defaults = {
"host": "localhost",
"port": 22,
"user": None,
"password": None,
"known_hosts": os.path.expanduser("~/.ssh/known_hosts"),
"auto_add_unknown_host_key": False,
"just_try": False,
}
ssh_client = None
sftp_client = None
initial_directory = None
# pylint: disable=arguments-differ,arguments-renamed
def configure(self, **kwargs):
"""Configure options on registered mylib.Config object"""
section = super().configure(
just_try=kwargs.pop("just_try", True),
just_try_help=kwargs.pop(
"just_try_help", "Just-try mode: do not really make change on remote SFTP host"
),
**kwargs,
)
section.add_option(
StringOption,
"host",
default=self._defaults["host"],
comment="SFTP server hostname/IP address",
)
section.add_option(
IntegerOption, "port", default=self._defaults["port"], comment="SFTP server port"
)
section.add_option(
StringOption,
"user",
default=self._defaults["user"],
comment="SFTP authentication username",
)
section.add_option(
PasswordOption,
"password",
default=self._defaults["password"],
comment='SFTP authentication password (set to "keyring" to use XDG keyring)',
username_option="user",
keyring_value="keyring",
)
section.add_option(
StringOption,
"known_hosts",
default=self._defaults["known_hosts"],
comment="SFTP known_hosts filepath",
)
section.add_option(
BooleanOption,
"auto_add_unknown_host_key",
default=self._defaults["auto_add_unknown_host_key"],
comment="Auto add unknown host key",
)
return section
def initialize(self, loaded_config=None):
"""Configuration initialized hook"""
super().__init__(loaded_config=loaded_config)
def connect(self):
"""Connect to SFTP server"""
if self.ssh_client:
return
host = self._get_option("host")
port = self._get_option("port")
log.info("Connect to SFTP server %s:%d", host, port)
self.ssh_client = SSHClient()
if self._get_option("known_hosts"):
self.ssh_client.load_host_keys(self._get_option("known_hosts"))
if self._get_option("auto_add_unknown_host_key"):
log.debug("Set missing host key policy to auto-add")
self.ssh_client.set_missing_host_key_policy(AutoAddPolicy())
self.ssh_client.connect(
host,
port=port,
username=self._get_option("user"),
password=self._get_option("password"),
)
self.sftp_client = self.ssh_client.open_sftp()
self.initial_directory = self.sftp_client.getcwd()
if self.initial_directory:
log.debug("Initial remote directory: '%s'", self.initial_directory)
else:
log.debug("Failed to retrieve remote directory; using empty string instead")
self.initial_directory = ""
def get_file(self, remote_filepath, local_filepath):
"""Retrieve a file from SFTP server"""
self.connect()
log.debug("Retrieve file '%s' to '%s'", remote_filepath, local_filepath)
return self.sftp_client.get(remote_filepath, local_filepath) is None
def open_file(self, remote_filepath, mode="r"):
"""Remotely open a file on SFTP server"""
self.connect()
log.debug("Remotely open file '%s'", remote_filepath)
return self.sftp_client.open(remote_filepath, mode=mode)
def upload_file(self, filepath, remote_directory=None):
"""Upload a file on SFTP server"""
self.connect()
remote_filepath = os.path.join(
remote_directory if remote_directory else self.initial_directory,
os.path.basename(filepath),
)
log.debug("Upload file '%s' to '%s'", filepath, remote_filepath)
if self._just_try:
log.debug(
"Just-try mode: do not really upload file '%s' to '%s'", filepath, remote_filepath
)
return True
result = self.sftp_client.put(filepath, remote_filepath)
return isinstance(result, SFTPAttributes)
def remove_file(self, filepath):
"""Remove a file on SFTP server"""
self.connect()
log.debug("Remove file '%s'", filepath)
if self._just_try:
log.debug("Just-try mode: do not really remove file '%s'", filepath)
return True
return self.sftp_client.remove(filepath) is None
def close(self):
"""Close SSH/SFTP connection"""
log.debug("Close connection")
self.ssh_client.close()
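`upload_file()` resolves the remote target path by joining the destination directory (or the initial connection directory when none is given) with the local file's basename. That resolution can be illustrated standalone with the stdlib (the function name here is a hypothetical extraction of the logic above):

```python
import os

def resolve_remote_path(filepath, remote_directory=None, initial_directory=""):
    # Mirrors the path logic in SFTPClient.upload_file(): fall back to
    # the initial connection directory when no destination is given.
    return os.path.join(
        remote_directory if remote_directory else initial_directory,
        os.path.basename(filepath),
    )

print(resolve_remote_path("/tmp/report.csv", "/incoming"))
print(resolve_remote_path("/tmp/report.csv"))  # no directory: bare basename
```

Because `initial_directory` defaults to an empty string when the server does not report a working directory, the fallback degrades gracefully to a bare relative filename.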


@ -1,53 +1,165 @@
""" Telltale files helpers """
import argparse
import datetime
import logging
import os
import sys
from mylib import pretty_format_timedelta
from mylib.scripts.helpers import get_opts_parser, init_logging
log = logging.getLogger(__name__)
DEFAULT_WARNING_THRESHOLD = 90
DEFAULT_CRITICAL_THRESHOLD = 240
class TelltaleFile:
""" Telltale file helper class """
"""Telltale file helper class"""
def __init__(self, filepath=None, filename=None, dirpath=None):
assert filepath or filename, "filename or filepath is required"
if filepath:
assert not filename or os.path.basename(filepath) == filename, "filepath and filename does not match"
assert not dirpath or os.path.dirname(filepath) == dirpath, "filepath and dirpath does not match"
assert (
not filename or os.path.basename(filepath) == filename
), "filepath and filename do not match"
assert (
not dirpath or os.path.dirname(filepath) == dirpath
), "filepath and dirpath do not match"
self.filename = filename if filename else os.path.basename(filepath)
self.dirpath = (
dirpath if dirpath
else (
os.path.dirname(filepath) if filepath
else os.getcwd()
)
dirpath if dirpath else (os.path.dirname(filepath) if filepath else os.getcwd())
)
self.filepath = filepath if filepath else os.path.join(self.dirpath, self.filename)
@property
def last_update(self):
""" Retreive last update datetime of the telltall file """
"""Retrieve last update datetime of the telltale file"""
try:
return datetime.datetime.fromtimestamp(
os.stat(self.filepath).st_mtime
)
return datetime.datetime.fromtimestamp(os.stat(self.filepath).st_mtime)
except FileNotFoundError:
log.info('Telltale file not found (%s)', self.filepath)
log.info("Telltale file not found (%s)", self.filepath)
return None
def update(self):
""" Update the telltale file """
log.info('Update telltale file (%s)', self.filepath)
"""Update the telltale file"""
log.info("Update telltale file (%s)", self.filepath)
try:
os.utime(self.filepath, None)
except FileNotFoundError:
open(self.filepath, 'a').close()
# pylint: disable=consider-using-with
open(self.filepath, "a", encoding="utf-8").close()
def remove(self):
""" Remove the telltale file """
"""Remove the telltale file"""
try:
os.remove(self.filepath)
return True
except FileNotFoundError:
return True
@classmethod
def check_entrypoint(
cls,
argv=None,
description=None,
default_filepath=None,
default_warning_threshold=None,
default_critical_threshold=None,
fail_message=None,
success_message=None,
):
"""Entry point of the script to check a telltale file last update"""
argv = argv if argv else sys.argv
description = description if description else "Check last execution date"
parser = get_opts_parser(desc=description, exit_on_error=False)
parser.add_argument(
"-p",
"--telltale-file-path",
action="store",
type=str,
dest="telltale_file_path",
help=f"Telltale file path (default: {default_filepath})",
default=default_filepath,
required=not default_filepath,
)
default_warning_threshold = (
default_warning_threshold
if default_warning_threshold is not None
else DEFAULT_WARNING_THRESHOLD
)
default_critical_threshold = (
default_critical_threshold
if default_critical_threshold is not None
else DEFAULT_CRITICAL_THRESHOLD
)
parser.add_argument(
"-w",
"--warning",
type=int,
dest="warning",
help=(
"Specify warning threshold (in minutes, default: "
f"{default_warning_threshold} minutes)"
),
default=default_warning_threshold,
)
parser.add_argument(
"-c",
"--critical",
type=int,
dest="critical",
help=(
"Specify critical threshold (in minutes, default: "
f"{default_critical_threshold} minutes)"
),
default=default_critical_threshold,
)
try:
options = parser.parse_args(argv[1:])
except argparse.ArgumentError as err:
print(f"UNKNOWN - {err}")
sys.exit(3)
# Initialize logs
init_logging(options, argv[0])
telltale_file = cls(filepath=options.telltale_file_path)
last = telltale_file.last_update
if not last:
status = "UNKNOWN"
exit_code = 3
msg = (
fail_message
if fail_message
else "Failed to retrieve the last successful execution date"
)
else:
delay = datetime.datetime.now() - last
msg = (
success_message
if success_message
else "Last successful execution was {last_delay} ago ({last_date})"
).format(
last_delay=pretty_format_timedelta(delay),
last_date=last.strftime("%Y/%m/%d %H:%M:%S"),
)
if delay >= datetime.timedelta(minutes=options.critical):
status = "CRITICAL"
exit_code = 2
elif delay >= datetime.timedelta(minutes=options.warning):
status = "WARNING"
exit_code = 1
else:
status = "OK"
exit_code = 0
print(f"{status} - {msg}")
sys.exit(exit_code)
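`check_entrypoint()` maps the telltale file's age onto Nagios-plugin-style statuses (OK/WARNING/CRITICAL/UNKNOWN with exit codes 0/1/2/3). The threshold comparison above can be sketched on its own (the function name is hypothetical; thresholds are in minutes, as in the script):

```python
import datetime

def telltale_status(last_update, now, warning=90, critical=240):
    """Sketch of the threshold logic in TelltaleFile.check_entrypoint():
    statuses and exit codes follow the Nagios plugin convention."""
    if last_update is None:
        return "UNKNOWN", 3
    delay = now - last_update
    if delay >= datetime.timedelta(minutes=critical):
        return "CRITICAL", 2
    if delay >= datetime.timedelta(minutes=warning):
        return "WARNING", 1
    return "OK", 0

now = datetime.datetime(2024, 4, 16, 12, 0)
print(telltale_status(now - datetime.timedelta(minutes=30), now))   # ('OK', 0)
print(telltale_status(now - datetime.timedelta(minutes=120), now))  # ('WARNING', 1)
print(telltale_status(None, now))                                   # ('UNKNOWN', 3)
```

Note that the critical threshold is checked first, so a delay past both thresholds reports CRITICAL rather than WARNING.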


@ -1,2 +1,3 @@
[flake8]
ignore = E501,W503
max-line-length = 100


@ -1,70 +1,87 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Setuptools script"""
from setuptools import find_packages
from setuptools import setup
from setuptools import find_packages, setup
extras_require={
'dev': [
'pytest',
'mocker',
'pytest-mock',
'pylint',
extras_require = {
"dev": [
"pytest",
"mocker",
"pytest-mock",
"pylint == 2.15.10",
"pre-commit",
],
'config': [
'argcomplete',
'keyring',
'systemd-python',
"config": [
"argcomplete",
"keyring",
"systemd-python",
],
'ldap': [
'python-ldap',
'python-dateutil',
'pytz',
"ldap": [
"python-ldap",
"python-dateutil",
"pytz",
],
'email': [
'mako',
"email": [
"mako",
],
'pgsql': [
'psycopg2',
"pgsql": [
"psycopg2",
],
'oracle': [
'cx_Oracle',
"oracle": [
"cx_Oracle",
],
'mysql': [
'mysqlclient',
"mysql": [
"mysqlclient",
],
"sftp": [
"paramiko",
],
}
install_requires = ['progressbar']
install_requires = ["progressbar"]
for extra, deps in extras_require.items():
if extra != 'dev':
if extra != "dev":
install_requires.extend(deps)
version = '0.1'
version = "0.1"
with open("README.md", encoding="utf-8") as fd:
long_description = fd.read()
setup(
name="mylib",
version=version,
description='A set of helpers small libs to make common tasks easier in my script development',
description="A set of small helper libs to make common tasks easier in my script development",
long_description=long_description,
classifiers=[
'Programming Language :: Python',
"Programming Language :: Python",
],
install_requires=install_requires,
extras_require=extras_require,
author='Benjamin Renard',
author_email='brenard@zionetrix.net',
url='https://gogs.zionetrix.net/bn8/python-mylib',
author="Benjamin Renard",
author_email="brenard@zionetrix.net",
url="https://gogs.zionetrix.net/bn8/python-mylib",
packages=find_packages(),
include_package_data=True,
package_data={
"": [
"scripts/email_templates/*.subject",
"scripts/email_templates/*.txt",
"scripts/email_templates/*.html",
],
},
zip_safe=False,
entry_points={
'console_scripts': [
'mylib-test-email = mylib.scripts.email_test:main',
'mylib-test-email-with-config = mylib.scripts.email_test_with_config:main',
'mylib-test-pbar = mylib.scripts.pbar_test:main',
'mylib-test-report = mylib.scripts.report_test:main',
'mylib-test-ldap = mylib.scripts.ldap_test:main',
"console_scripts": [
"mylib-test-email = mylib.scripts.email_test:main",
"mylib-test-email-with-config = mylib.scripts.email_test_with_config:main",
"mylib-test-map = mylib.scripts.map_test:main",
"mylib-test-pbar = mylib.scripts.pbar_test:main",
"mylib-test-report = mylib.scripts.report_test:main",
"mylib-test-ldap = mylib.scripts.ldap_test:main",
"mylib-test-sftp = mylib.scripts.sftp_test:main",
"mylib-test-telltale = mylib.scripts.telltale_test:main",
"mylib-test-telltale-check = mylib.scripts.telltale_check_test:main",
],
},
)


@ -1,29 +1,75 @@
#!/bin/bash
QUIET_ARG=""
NO_VENV=0
function usage() {
[ -n "$1" ] && echo -e "$1\n" > /dev/stderr
echo "Usage: $0 [-x] [-q|--quiet] [--no-venv]"
echo " -h/--help Show usage message"
echo " -q/--quiet Enable quiet mode"
echo " -n/--no-venv Disable venv creation and run tests on system environment"
echo " -x Enable debug mode"
[ -n "$1" ] && exit 1
exit 0
}
idx=1
while [ $idx -le $# ]
do
OPT=${!idx}
case $OPT in
-h|--help)
usage
;;
-q|--quiet)
QUIET_ARG="--quiet"
;;
-n|--no-venv)
NO_VENV=1
;;
-x)
set -x
;;
*)
usage "Unknown parameter '$OPT'"
esac
let idx=idx+1
done
[ "$1" == "--quiet" ] && QUIET_ARG="--quiet"
# Enter source directory
cd $( dirname $0 )
if [ -d venv ]
TEMP_VENV=0
VENV=""
if [ $NO_VENV -eq 1 ]
then
VENV=$( realpath venv )
TEMP_VENV=0
echo "Run tests in system environment..."
elif [ -d venv ]
then
VENV=$( realpath venv )
echo "Using existing virtualenv ($VENV)..."
else
# Create a temporary venv
VENV=$(mktemp -d)
echo "Create a temporary virtualenv in $VENV to install dependencies..."
TEMP_VENV=1
python3 -m venv $VENV
# Create a temporary venv
VENV=$(mktemp -d)
echo "Create a temporary virtualenv in $VENV..."
TEMP_VENV=1
python3 -m venv $VENV
fi
echo "Install package with dev dependencies using pip..."
$VENV/bin/python3 -m pip install -e ".[dev]" $QUIET_ARG
if [ -n "$VENV" ]
then
echo "Install package with dev dependencies using pip in virtualenv..."
$VENV/bin/python3 -m pip install -e ".[dev]" $QUIET_ARG
source $VENV/bin/activate
fi
# Run tests
$VENV/bin/python3 -m pytest tests
RES=$?
# Run pre-commit
RES=0
echo "Run pre-commit..."
pre-commit run --all-files
[ $? -ne 0 ] && RES=1
# Clean temporary venv
[ $TEMP_VENV -eq 1 ] && rm -fr $VENV


@ -1,24 +1,30 @@
# pylint: disable=redefined-outer-name,missing-function-docstring,protected-access,global-statement
# pylint: disable=global-variable-not-assigned
""" Tests on config lib """
from mylib.config import Config, ConfigSection
from mylib.config import StringOption
import configparser
import logging
import os
runned = dict()
import pytest
from mylib.config import BooleanOption, Config, ConfigSection, StringOption
tested = {}
def test_config_init_default_args():
appname = 'Test app'
appname = "Test app"
config = Config(appname)
assert config.appname == appname
assert config.version == '0.0'
assert config.encoding == 'utf-8'
assert config.version == "0.0"
assert config.encoding == "utf-8"
def test_config_init_custom_args():
appname = 'Test app'
version = '1.43'
encoding = 'ISO-8859-1'
appname = "Test app"
version = "1.43"
encoding = "ISO-8859-1"
config = Config(appname, version=version, encoding=encoding)
assert config.appname == appname
assert config.version == version
@ -26,8 +32,8 @@ def test_config_init_custom_args():
def test_add_section_default_args():
config = Config('Test app')
name = 'test_section'
config = Config("Test app")
name = "test_section"
section = config.add_section(name)
assert isinstance(section, ConfigSection)
assert config.sections[name] == section
@ -37,9 +43,9 @@ def test_add_section_default_args():
def test_add_section_custom_args():
config = Config('Test app')
name = 'test_section'
comment = 'Test'
config = Config("Test app")
name = "test_section"
comment = "Test"
order = 20
section = config.add_section(name, comment=comment, order=order)
assert isinstance(section, ConfigSection)
@@ -49,58 +55,58 @@ def test_add_section_custom_args():
def test_add_section_with_callback():
config = Config('Test app')
name = 'test_section'
config = Config("Test app")
name = "test_section"
global runned
runned['test_add_section_with_callback'] = False
global tested
tested["test_add_section_with_callback"] = False
def test_callback(loaded_config):
global runned
global tested
assert loaded_config == config
assert runned['test_add_section_with_callback'] is False
runned['test_add_section_with_callback'] = True
assert tested["test_add_section_with_callback"] is False
tested["test_add_section_with_callback"] = True
section = config.add_section(name, loaded_callback=test_callback)
assert isinstance(section, ConfigSection)
assert test_callback in config._loaded_callbacks
assert runned['test_add_section_with_callback'] is False
assert tested["test_add_section_with_callback"] is False
config.parse_arguments_options(argv=[], create=False)
assert runned['test_add_section_with_callback'] is True
assert tested["test_add_section_with_callback"] is True
assert test_callback in config._loaded_callbacks_executed
# Try to execute again to verify callback is not runned again
# Try to execute again to verify callback is not run again
config._loaded()
def test_add_section_with_callback_already_loaded():
config = Config('Test app')
name = 'test_section'
config = Config("Test app")
name = "test_section"
config.parse_arguments_options(argv=[], create=False)
global runned
runned['test_add_section_with_callback_already_loaded'] = False
global tested
tested["test_add_section_with_callback_already_loaded"] = False
def test_callback(loaded_config):
global runned
global tested
assert loaded_config == config
assert runned['test_add_section_with_callback_already_loaded'] is False
runned['test_add_section_with_callback_already_loaded'] = True
assert tested["test_add_section_with_callback_already_loaded"] is False
tested["test_add_section_with_callback_already_loaded"] = True
section = config.add_section(name, loaded_callback=test_callback)
assert isinstance(section, ConfigSection)
assert runned['test_add_section_with_callback_already_loaded'] is True
assert tested["test_add_section_with_callback_already_loaded"] is True
assert test_callback in config._loaded_callbacks
assert test_callback in config._loaded_callbacks_executed
# Try to execute again to verify callback is not runned again
# Try to execute again to verify callback is not run again
config._loaded()
def test_add_option_default_args():
config = Config('Test app')
section = config.add_section('my_section')
config = Config("Test app")
section = config.add_section("my_section")
assert isinstance(section, ConfigSection)
name = 'my_option'
name = "my_option"
option = section.add_option(StringOption, name)
assert isinstance(option, StringOption)
assert name in section.options and section.options[name] == option
@@ -116,18 +122,18 @@ def test_add_option_default_args():
def test_add_option_custom_args():
config = Config('Test app')
section = config.add_section('my_section')
config = Config("Test app")
section = config.add_section("my_section")
assert isinstance(section, ConfigSection)
name = 'my_option'
kwargs = dict(
default='default value',
comment='my comment',
no_arg=True,
arg='--my-option',
short_arg='-M',
arg_help='My help'
)
name = "my_option"
kwargs = {
"default": "default value",
"comment": "my comment",
"no_arg": True,
"arg": "--my-option",
"short_arg": "-M",
"arg_help": "My help",
}
option = section.add_option(StringOption, name, **kwargs)
assert isinstance(option, StringOption)
assert name in section.options and section.options[name] == option
@@ -140,12 +146,12 @@ def test_add_option_custom_args():
def test_defined():
config = Config('Test app')
section_name = 'my_section'
opt_name = 'my_option'
config = Config("Test app")
section_name = "my_section"
opt_name = "my_option"
assert not config.defined(section_name, opt_name)
section = config.add_section('my_section')
section = config.add_section("my_section")
assert isinstance(section, ConfigSection)
section.add_option(StringOption, opt_name)
@@ -153,31 +159,31 @@ def test_defined():
def test_isset():
config = Config('Test app')
section_name = 'my_section'
opt_name = 'my_option'
config = Config("Test app")
section_name = "my_section"
opt_name = "my_option"
assert not config.isset(section_name, opt_name)
section = config.add_section('my_section')
section = config.add_section("my_section")
assert isinstance(section, ConfigSection)
option = section.add_option(StringOption, opt_name)
assert not config.isset(section_name, opt_name)
config.parse_arguments_options(argv=[option.parser_argument_name, 'value'], create=False)
config.parse_arguments_options(argv=[option.parser_argument_name, "value"], create=False)
assert config.isset(section_name, opt_name)
def test_not_isset():
config = Config('Test app')
section_name = 'my_section'
opt_name = 'my_option'
config = Config("Test app")
section_name = "my_section"
opt_name = "my_option"
assert not config.isset(section_name, opt_name)
section = config.add_section('my_section')
section = config.add_section("my_section")
assert isinstance(section, ConfigSection)
option = section.add_option(StringOption, opt_name)
section.add_option(StringOption, opt_name)
assert not config.isset(section_name, opt_name)
@@ -187,11 +193,11 @@ def test_not_isset():
def test_get():
config = Config('Test app')
section_name = 'my_section'
opt_name = 'my_option'
opt_value = 'value'
section = config.add_section('my_section')
config = Config("Test app")
section_name = "my_section"
opt_name = "my_option"
opt_value = "value"
section = config.add_section("my_section")
option = section.add_option(StringOption, opt_name)
config.parse_arguments_options(argv=[option.parser_argument_name, opt_value], create=False)
@@ -199,12 +205,142 @@ def test_get():
def test_get_default():
config = Config('Test app')
section_name = 'my_section'
opt_name = 'my_option'
opt_default_value = 'value'
section = config.add_section('my_section')
option = section.add_option(StringOption, opt_name, default=opt_default_value)
config = Config("Test app")
section_name = "my_section"
opt_name = "my_option"
opt_default_value = "value"
section = config.add_section("my_section")
section.add_option(StringOption, opt_name, default=opt_default_value)
config.parse_arguments_options(argv=[], create=False)
assert config.get(section_name, opt_name) == opt_default_value
def test_logging_splited_stdout_stderr(capsys):
config = Config("Test app")
config.parse_arguments_options(argv=["-C", "-v"], create=False)
info_msg = "[info]"
err_msg = "[error]"
logging.getLogger().info(info_msg)
logging.getLogger().error(err_msg)
captured = capsys.readouterr()
assert info_msg in captured.out
assert info_msg not in captured.err
assert err_msg in captured.err
assert err_msg not in captured.out
#
# Test option types
#
@pytest.fixture()
def config_with_file(tmpdir):
config = Config("Test app")
config_dir = tmpdir.mkdir("config")
config_file = config_dir.join("config.ini")
config.save(os.path.join(config_file.dirname, config_file.basename))
return config
def generate_mock_input(expected_prompt, input_value):
def mock_input(self, prompt): # pylint: disable=unused-argument
assert prompt == expected_prompt
return input_value
return mock_input
# Boolean option
def test_boolean_option_from_config(config_with_file):
section = config_with_file.add_section("test")
default = True
option = section.add_option(BooleanOption, "test_bool", default=default)
config_with_file.save()
option.set(not default)
assert option._from_config is not default
option.set(default)
assert not option._isset_in_config_file
with pytest.raises(configparser.NoOptionError):
assert option._from_config is default
def test_boolean_option_ask_value(mocker):
config = Config("Test app")
section = config.add_section("test")
name = "test_bool"
option = section.add_option(BooleanOption, name, default=True)
mocker.patch(
"mylib.config.BooleanOption._get_user_input", generate_mock_input(f"{name}: [Y/n] ", "y")
)
assert option.ask_value(set_it=False) is True
mocker.patch(
"mylib.config.BooleanOption._get_user_input", generate_mock_input(f"{name}: [Y/n] ", "Y")
)
assert option.ask_value(set_it=False) is True
mocker.patch(
"mylib.config.BooleanOption._get_user_input", generate_mock_input(f"{name}: [Y/n] ", "")
)
assert option.ask_value(set_it=False) is True
mocker.patch(
"mylib.config.BooleanOption._get_user_input", generate_mock_input(f"{name}: [Y/n] ", "n")
)
assert option.ask_value(set_it=False) is False
mocker.patch(
"mylib.config.BooleanOption._get_user_input", generate_mock_input(f"{name}: [Y/n] ", "N")
)
assert option.ask_value(set_it=False) is False
def test_boolean_option_to_config():
config = Config("Test app")
section = config.add_section("test")
default = True
option = section.add_option(BooleanOption, "test_bool", default=default)
assert option.to_config(True) == "true"
assert option.to_config(False) == "false"
def test_boolean_option_export_to_config(config_with_file):
section = config_with_file.add_section("test")
name = "test_bool"
comment = "Test boolean"
default = True
option = section.add_option(BooleanOption, name, default=default, comment=comment)
assert (
option.export_to_config()
== f"""# {comment}
# Default: {str(default).lower()}
# {name} =
"""
)
option.set(not default)
assert (
option.export_to_config()
== f"""# {comment}
# Default: {str(default).lower()}
{name} = {str(not default).lower()}
"""
)
option.set(default)
assert (
option.export_to_config()
== f"""# {comment}
# Default: {str(default).lower()}
# {name} =
"""
)

tests/test_mysql.py (new file, 491 additions)

@@ -0,0 +1,491 @@
# pylint: disable=redefined-outer-name,missing-function-docstring,protected-access
""" Tests on opening hours helpers """
import pytest
from MySQLdb._exceptions import Error
from mylib.mysql import MyDB
class FakeMySQLdbCursor:
"""Fake MySQLdb cursor"""
def __init__(
self, expected_sql, expected_params, expected_return, expected_just_try, expected_exception
):
self.expected_sql = expected_sql
self.expected_params = expected_params
self.expected_return = expected_return
self.expected_just_try = expected_just_try
self.expected_exception = expected_exception
def execute(self, sql, params=None):
if self.expected_exception:
raise Error(f"{self}.execute({sql}, {params}): expected exception")
if self.expected_just_try and not sql.lower().startswith("select "):
assert False, f"{self}.execute({sql}, {params}) may not be executed in just try mode"
# pylint: disable=consider-using-f-string
assert (
sql == self.expected_sql
), "%s.execute(): Invalid SQL query:\n '%s'\nMay be:\n '%s'" % (
self,
sql,
self.expected_sql,
)
# pylint: disable=consider-using-f-string
assert (
params == self.expected_params
), "%s.execute(): Invalid params:\n %s\nMay be:\n %s" % (
self,
params,
self.expected_params,
)
return self.expected_return
@property
def description(self):
assert self.expected_return
assert isinstance(self.expected_return, list)
assert isinstance(self.expected_return[0], dict)
return [(field, 1, 2, 3) for field in self.expected_return[0].keys()]
def fetchall(self):
if isinstance(self.expected_return, list):
return (
list(row.values()) if isinstance(row, dict) else row for row in self.expected_return
)
return self.expected_return
def __repr__(self):
return (
f"FakeMySQLdbCursor({self.expected_sql}, {self.expected_params}, "
f"{self.expected_return}, {self.expected_just_try})"
)
class FakeMySQLdb:
"""Fake MySQLdb connection"""
expected_sql = None
expected_params = None
expected_return = True
expected_just_try = False
expected_exception = False
just_try = False
def __init__(self, **kwargs):
allowed_kwargs = {
"db": str,
"user": str,
"passwd": (str, None),
"host": str,
"charset": str,
"use_unicode": bool,
}
for arg, value in kwargs.items():
assert arg in allowed_kwargs, f'Invalid arg {arg}="{value}"'
assert isinstance(
value, allowed_kwargs[arg]
), f"Arg {arg} not a {allowed_kwargs[arg]} ({type(value)})"
setattr(self, arg, value)
def close(self):
return self.expected_return
def cursor(self):
return FakeMySQLdbCursor(
self.expected_sql,
self.expected_params,
self.expected_return,
self.expected_just_try or self.just_try,
self.expected_exception,
)
def commit(self):
self._check_just_try()
return self.expected_return
def rollback(self):
self._check_just_try()
return self.expected_return
def _check_just_try(self):
if self.just_try:
assert False, "May not be executed in just try mode"
def fake_mysqldb_connect(**kwargs):
return FakeMySQLdb(**kwargs)
def fake_mysqldb_connect_just_try(**kwargs):
con = FakeMySQLdb(**kwargs)
con.just_try = True
return con
@pytest.fixture
def test_mydb():
return MyDB("127.0.0.1", "user", "password", "dbname")
@pytest.fixture
def fake_mydb(mocker):
mocker.patch("MySQLdb.connect", fake_mysqldb_connect)
return MyDB("127.0.0.1", "user", "password", "dbname")
@pytest.fixture
def fake_just_try_mydb(mocker):
mocker.patch("MySQLdb.connect", fake_mysqldb_connect_just_try)
return MyDB("127.0.0.1", "user", "password", "dbname", just_try=True)
@pytest.fixture
def fake_connected_mydb(fake_mydb):
fake_mydb.connect()
return fake_mydb
@pytest.fixture
def fake_connected_just_try_mydb(fake_just_try_mydb):
fake_just_try_mydb.connect()
return fake_just_try_mydb
def generate_mock_args(
expected_args=(), expected_kwargs={}, expected_return=True
): # pylint: disable=dangerous-default-value
def mock_args(*args, **kwargs):
# pylint: disable=consider-using-f-string
assert args == expected_args, "Invalid call args:\n %s\nMay be:\n %s" % (
args,
expected_args,
)
# pylint: disable=consider-using-f-string
assert kwargs == expected_kwargs, "Invalid call kwargs:\n %s\nMay be:\n %s" % (
kwargs,
expected_kwargs,
)
return expected_return
return mock_args
def mock_doSQL_just_try(self, sql, params=None): # pylint: disable=unused-argument
assert False, "doSQL() may not be executed in just try mode"
def generate_mock_doSQL(
expected_sql, expected_params={}, expected_return=True
): # pylint: disable=dangerous-default-value
def mock_doSQL(self, sql, params=None): # pylint: disable=unused-argument
# pylint: disable=consider-using-f-string
assert sql == expected_sql, "Invalid generated SQL query:\n '%s'\nMay be:\n '%s'" % (
sql,
expected_sql,
)
# pylint: disable=consider-using-f-string
assert params == expected_params, "Invalid generated params:\n %s\nMay be:\n %s" % (
params,
expected_params,
)
return expected_return
return mock_doSQL
# MyDB.doSelect() has the same expected parameters as MyDB.doSQL()
generate_mock_doSelect = generate_mock_doSQL
mock_doSelect_just_try = mock_doSQL_just_try
#
# Test on MyDB helper methods
#
def test_combine_params_with_to_add_parameter():
assert MyDB._combine_params({"test1": 1}, {"test2": 2}) == {"test1": 1, "test2": 2}
def test_combine_params_with_kargs():
assert MyDB._combine_params({"test1": 1}, test2=2) == {"test1": 1, "test2": 2}
def test_combine_params_with_kargs_and_to_add_parameter():
assert MyDB._combine_params({"test1": 1}, {"test2": 2}, test3=3) == {
"test1": 1,
"test2": 2,
"test3": 3,
}
def test_format_where_clauses_params_are_preserved():
args = ("test = test", {"test1": 1})
assert MyDB._format_where_clauses(*args) == args
def test_format_where_clauses_raw():
assert MyDB._format_where_clauses("test = test") == ("test = test", {})
def test_format_where_clauses_tuple_clause_with_params():
where_clauses = ("test1 = %(test1)s AND test2 = %(test2)s", {"test1": 1, "test2": 2})
assert MyDB._format_where_clauses(where_clauses) == where_clauses
def test_format_where_clauses_dict():
where_clauses = {"test1": 1, "test2": 2}
assert MyDB._format_where_clauses(where_clauses) == (
"`test1` = %(test1)s AND `test2` = %(test2)s",
where_clauses,
)
def test_format_where_clauses_combined_types():
where_clauses = ("test1 = 1", ("test2 LIKE %(test2)s", {"test2": 2}), {"test3": 3, "test4": 4})
assert MyDB._format_where_clauses(where_clauses) == (
"test1 = 1 AND test2 LIKE %(test2)s AND `test3` = %(test3)s AND `test4` = %(test4)s",
{"test2": 2, "test3": 3, "test4": 4},
)
def test_format_where_clauses_with_where_op():
where_clauses = {"test1": 1, "test2": 2}
assert MyDB._format_where_clauses(where_clauses, where_op="OR") == (
"`test1` = %(test1)s OR `test2` = %(test2)s",
where_clauses,
)
def test_add_where_clauses():
sql = "SELECT * FROM table"
where_clauses = {"test1": 1, "test2": 2}
assert MyDB._add_where_clauses(sql, None, where_clauses) == (
sql + " WHERE `test1` = %(test1)s AND `test2` = %(test2)s",
where_clauses,
)
def test_add_where_clauses_preserved_params():
sql = "SELECT * FROM table"
where_clauses = {"test1": 1, "test2": 2}
params = {"fake1": 1}
assert MyDB._add_where_clauses(sql, params.copy(), where_clauses) == (
sql + " WHERE `test1` = %(test1)s AND `test2` = %(test2)s",
{**where_clauses, **params},
)
def test_add_where_clauses_with_op():
sql = "SELECT * FROM table"
where_clauses = ("test1=1", "test2=2")
assert MyDB._add_where_clauses(sql, None, where_clauses, where_op="OR") == (
sql + " WHERE test1=1 OR test2=2",
{},
)
def test_add_where_clauses_with_duplicated_field():
sql = "UPDATE table SET test1=%(test1)s"
params = {"test1": "new_value"}
where_clauses = {"test1": "where_value"}
assert MyDB._add_where_clauses(sql, params, where_clauses) == (
sql + " WHERE `test1` = %(test1_1)s",
{"test1": "new_value", "test1_1": "where_value"},
)
def test_quote_table_name():
assert MyDB._quote_table_name("mytable") == "`mytable`"
assert MyDB._quote_table_name("myschema.mytable") == "`myschema`.`mytable`"
def test_insert(mocker, test_mydb):
values = {"test1": 1, "test2": 2}
mocker.patch(
"mylib.mysql.MyDB.doSQL",
generate_mock_doSQL(
"INSERT INTO `mytable` (`test1`, `test2`) VALUES (%(test1)s, %(test2)s)", values
),
)
assert test_mydb.insert("mytable", values)
def test_insert_just_try(mocker, test_mydb):
mocker.patch("mylib.mysql.MyDB.doSQL", mock_doSQL_just_try)
assert test_mydb.insert("mytable", {"test1": 1, "test2": 2}, just_try=True)
def test_update(mocker, test_mydb):
values = {"test1": 1, "test2": 2}
where_clauses = {"test3": 3, "test4": 4}
mocker.patch(
"mylib.mysql.MyDB.doSQL",
generate_mock_doSQL(
"UPDATE `mytable` SET `test1` = %(test1)s, `test2` = %(test2)s WHERE `test3` ="
" %(test3)s AND `test4` = %(test4)s",
{**values, **where_clauses},
),
)
assert test_mydb.update("mytable", values, where_clauses)
def test_update_just_try(mocker, test_mydb):
mocker.patch("mylib.mysql.MyDB.doSQL", mock_doSQL_just_try)
assert test_mydb.update("mytable", {"test1": 1, "test2": 2}, None, just_try=True)
def test_delete(mocker, test_mydb):
where_clauses = {"test1": 1, "test2": 2}
mocker.patch(
"mylib.mysql.MyDB.doSQL",
generate_mock_doSQL(
"DELETE FROM `mytable` WHERE `test1` = %(test1)s AND `test2` = %(test2)s", where_clauses
),
)
assert test_mydb.delete("mytable", where_clauses)
def test_delete_just_try(mocker, test_mydb):
mocker.patch("mylib.mysql.MyDB.doSQL", mock_doSQL_just_try)
assert test_mydb.delete("mytable", None, just_try=True)
def test_truncate(mocker, test_mydb):
mocker.patch("mylib.mysql.MyDB.doSQL", generate_mock_doSQL("TRUNCATE TABLE `mytable`", None))
assert test_mydb.truncate("mytable")
def test_truncate_just_try(mocker, test_mydb):
mocker.patch("mylib.mysql.MyDB.doSQL", mock_doSelect_just_try)
assert test_mydb.truncate("mytable", just_try=True)
def test_select(mocker, test_mydb):
fields = ("field1", "field2")
where_clauses = {"test3": 3, "test4": 4}
expected_return = [
{"field1": 1, "field2": 2},
{"field1": 2, "field2": 3},
]
order_by = "field1, DESC"
limit = 10
mocker.patch(
"mylib.mysql.MyDB.doSelect",
generate_mock_doSQL(
"SELECT `field1`, `field2` FROM `mytable` WHERE `test3` = %(test3)s AND `test4` ="
" %(test4)s ORDER BY " + order_by + " LIMIT " + str(limit), # nosec: B608
where_clauses,
expected_return,
),
)
assert (
test_mydb.select("mytable", where_clauses, fields, order_by=order_by, limit=limit)
== expected_return
)
def test_select_without_field_and_order_by(mocker, test_mydb):
mocker.patch("mylib.mysql.MyDB.doSelect", generate_mock_doSQL("SELECT * FROM `mytable`"))
assert test_mydb.select("mytable")
def test_select_just_try(mocker, test_mydb):
mocker.patch("mylib.mysql.MyDB.doSQL", mock_doSelect_just_try)
assert test_mydb.select("mytable", None, None, just_try=True)
#
# Tests on main methods
#
def test_connect(mocker, test_mydb):
expected_kwargs = {
"db": test_mydb._db,
"user": test_mydb._user,
"host": test_mydb._host,
"passwd": test_mydb._pwd,
"charset": test_mydb._charset,
"use_unicode": True,
}
mocker.patch("MySQLdb.connect", generate_mock_args(expected_kwargs=expected_kwargs))
assert test_mydb.connect()
def test_close(fake_mydb):
assert fake_mydb.close() is None
def test_close_connected(fake_connected_mydb):
assert fake_connected_mydb.close() is None
def test_doSQL(fake_connected_mydb):
fake_connected_mydb._conn.expected_sql = "DELETE FROM table WHERE test1 = %(test1)s"
fake_connected_mydb._conn.expected_params = {"test1": 1}
fake_connected_mydb.doSQL(
fake_connected_mydb._conn.expected_sql, fake_connected_mydb._conn.expected_params
)
def test_doSQL_without_params(fake_connected_mydb):
fake_connected_mydb._conn.expected_sql = "DELETE FROM table"
fake_connected_mydb.doSQL(fake_connected_mydb._conn.expected_sql)
def test_doSQL_just_try(fake_connected_just_try_mydb):
assert fake_connected_just_try_mydb.doSQL("DELETE FROM table")
def test_doSQL_on_exception(fake_connected_mydb):
fake_connected_mydb._conn.expected_exception = True
assert fake_connected_mydb.doSQL("DELETE FROM table") is False
def test_doSelect(fake_connected_mydb):
fake_connected_mydb._conn.expected_sql = "SELECT * FROM table WHERE test1 = %(test1)s"
fake_connected_mydb._conn.expected_params = {"test1": 1}
fake_connected_mydb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_mydb.doSelect(
fake_connected_mydb._conn.expected_sql, fake_connected_mydb._conn.expected_params
)
== fake_connected_mydb._conn.expected_return
)
def test_doSelect_without_params(fake_connected_mydb):
fake_connected_mydb._conn.expected_sql = "SELECT * FROM table"
fake_connected_mydb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_mydb.doSelect(fake_connected_mydb._conn.expected_sql)
== fake_connected_mydb._conn.expected_return
)
def test_doSelect_on_exception(fake_connected_mydb):
fake_connected_mydb._conn.expected_exception = True
assert fake_connected_mydb.doSelect("SELECT * FROM table") is False
def test_doSelect_just_try(fake_connected_just_try_mydb):
fake_connected_just_try_mydb._conn.expected_sql = "SELECT * FROM table WHERE test1 = %(test1)s"
fake_connected_just_try_mydb._conn.expected_params = {"test1": 1}
fake_connected_just_try_mydb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_just_try_mydb.doSelect(
fake_connected_just_try_mydb._conn.expected_sql,
fake_connected_just_try_mydb._conn.expected_params,
)
== fake_connected_just_try_mydb._conn.expected_return
)


@@ -2,6 +2,7 @@
""" Tests on opening hours helpers """
import datetime
import pytest
from mylib import opening_hours
@@ -12,14 +13,16 @@ from mylib import opening_hours
def test_parse_exceptional_closures_one_day_without_time_period():
assert opening_hours.parse_exceptional_closures(["22/09/2017"]) == [{'days': [datetime.date(2017, 9, 22)], 'hours_periods': []}]
assert opening_hours.parse_exceptional_closures(["22/09/2017"]) == [
{"days": [datetime.date(2017, 9, 22)], "hours_periods": []}
]
def test_parse_exceptional_closures_one_day_with_time_period():
assert opening_hours.parse_exceptional_closures(["26/11/2017 9h30-12h30"]) == [
{
'days': [datetime.date(2017, 11, 26)],
'hours_periods': [{'start': datetime.time(9, 30), 'stop': datetime.time(12, 30)}]
"days": [datetime.date(2017, 11, 26)],
"hours_periods": [{"start": datetime.time(9, 30), "stop": datetime.time(12, 30)}],
}
]
@@ -27,11 +30,11 @@ def test_parse_exceptional_closures_one_day_with_time_period():
def test_parse_exceptional_closures_one_day_with_multiple_time_periods():
assert opening_hours.parse_exceptional_closures(["26/11/2017 9h30-12h30 14h-18h"]) == [
{
'days': [datetime.date(2017, 11, 26)],
'hours_periods': [
{'start': datetime.time(9, 30), 'stop': datetime.time(12, 30)},
{'start': datetime.time(14, 0), 'stop': datetime.time(18, 0)},
]
"days": [datetime.date(2017, 11, 26)],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 30)},
{"start": datetime.time(14, 0), "stop": datetime.time(18, 0)},
],
}
]
@@ -39,8 +42,12 @@ def test_parse_exceptional_closures_one_day_with_multiple_time_periods():
def test_parse_exceptional_closures_full_days_period():
assert opening_hours.parse_exceptional_closures(["20/09/2017-22/09/2017"]) == [
{
'days': [datetime.date(2017, 9, 20), datetime.date(2017, 9, 21), datetime.date(2017, 9, 22)],
'hours_periods': []
"days": [
datetime.date(2017, 9, 20),
datetime.date(2017, 9, 21),
datetime.date(2017, 9, 22),
],
"hours_periods": [],
}
]
@@ -53,8 +60,12 @@ def test_parse_exceptional_closures_invalid_days_period():
def test_parse_exceptional_closures_days_period_with_time_period():
assert opening_hours.parse_exceptional_closures(["20/09/2017-22/09/2017 9h-12h"]) == [
{
'days': [datetime.date(2017, 9, 20), datetime.date(2017, 9, 21), datetime.date(2017, 9, 22)],
'hours_periods': [{'start': datetime.time(9, 0), 'stop': datetime.time(12, 0)}]
"days": [
datetime.date(2017, 9, 20),
datetime.date(2017, 9, 21),
datetime.date(2017, 9, 22),
],
"hours_periods": [{"start": datetime.time(9, 0), "stop": datetime.time(12, 0)}],
}
]
@@ -70,31 +81,38 @@ def test_parse_exceptional_closures_invalid_time_period():
def test_parse_exceptional_closures_multiple_periods():
assert opening_hours.parse_exceptional_closures(["20/09/2017 25/11/2017-26/11/2017 9h30-12h30 14h-18h"]) == [
assert opening_hours.parse_exceptional_closures(
["20/09/2017 25/11/2017-26/11/2017 9h30-12h30 14h-18h"]
) == [
{
'days': [
"days": [
datetime.date(2017, 9, 20),
datetime.date(2017, 11, 25),
datetime.date(2017, 11, 26),
],
'hours_periods': [
{'start': datetime.time(9, 30), 'stop': datetime.time(12, 30)},
{'start': datetime.time(14, 0), 'stop': datetime.time(18, 0)},
]
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 30)},
{"start": datetime.time(14, 0), "stop": datetime.time(18, 0)},
],
}
]
#
# Tests on parse_normal_opening_hours()
#
def test_parse_normal_opening_hours_one_day():
assert opening_hours.parse_normal_opening_hours(["jeudi"]) == [{'days': ["jeudi"], 'hours_periods': []}]
assert opening_hours.parse_normal_opening_hours(["jeudi"]) == [
{"days": ["jeudi"], "hours_periods": []}
]
def test_parse_normal_opening_hours_multiple_days():
assert opening_hours.parse_normal_opening_hours(["lundi jeudi"]) == [{'days': ["lundi", "jeudi"], 'hours_periods': []}]
assert opening_hours.parse_normal_opening_hours(["lundi jeudi"]) == [
{"days": ["lundi", "jeudi"], "hours_periods": []}
]
def test_parse_normal_opening_hours_invalid_day():
@@ -104,13 +122,17 @@ def test_parse_normal_opening_hours_invalid_day():
def test_parse_normal_opening_hours_one_days_period():
assert opening_hours.parse_normal_opening_hours(["lundi-jeudi"]) == [
{'days': ["lundi", "mardi", "mercredi", "jeudi"], 'hours_periods': []}
{"days": ["lundi", "mardi", "mercredi", "jeudi"], "hours_periods": []}
]
def test_parse_normal_opening_hours_one_day_with_one_time_period():
assert opening_hours.parse_normal_opening_hours(["jeudi 9h-12h"]) == [
{'days': ["jeudi"], 'hours_periods': [{'start': datetime.time(9, 0), 'stop': datetime.time(12, 0)}]}]
{
"days": ["jeudi"],
"hours_periods": [{"start": datetime.time(9, 0), "stop": datetime.time(12, 0)}],
}
]
def test_parse_normal_opening_hours_invalid_days_period():
@@ -122,7 +144,10 @@ def test_parse_normal_opening_hours_invalid_days_period():
def test_parse_normal_opening_hours_one_time_period():
assert opening_hours.parse_normal_opening_hours(["9h-18h30"]) == [
{'days': [], 'hours_periods': [{'start': datetime.time(9, 0), 'stop': datetime.time(18, 30)}]}
{
"days": [],
"hours_periods": [{"start": datetime.time(9, 0), "stop": datetime.time(18, 30)}],
}
]
@@ -132,62 +157,253 @@ def test_parse_normal_opening_hours_invalid_time_period():
def test_parse_normal_opening_hours_multiple_periods():
assert opening_hours.parse_normal_opening_hours(["lundi-vendredi 9h30-12h30 14h-18h", "samedi 9h30-18h", "dimanche 9h30-12h"]) == [
assert opening_hours.parse_normal_opening_hours(
["lundi-vendredi 9h30-12h30 14h-18h", "samedi 9h30-18h", "dimanche 9h30-12h"]
) == [
{
'days': ['lundi', 'mardi', 'mercredi', 'jeudi', 'vendredi'],
'hours_periods': [
{'start': datetime.time(9, 30), 'stop': datetime.time(12, 30)},
{'start': datetime.time(14, 0), 'stop': datetime.time(18, 0)},
]
"days": ["lundi", "mardi", "mercredi", "jeudi", "vendredi"],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 30)},
{"start": datetime.time(14, 0), "stop": datetime.time(18, 0)},
],
},
{
'days': ['samedi'],
'hours_periods': [
{'start': datetime.time(9, 30), 'stop': datetime.time(18, 0)},
]
"days": ["samedi"],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(18, 0)},
],
},
{
'days': ['dimanche'],
'hours_periods': [
{'start': datetime.time(9, 30), 'stop': datetime.time(12, 0)},
]
"days": ["dimanche"],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 0)},
],
},
]
def test_parse_normal_opening_hours_is_sorted():
assert opening_hours.parse_normal_opening_hours(
[
"samedi 9h30-18h",
"lundi-vendredi 14h-18h 9h30-12h30",
"samedi 9h30-12h",
"dimanche 9h30-12h",
]
) == [
{
"days": ["lundi", "mardi", "mercredi", "jeudi", "vendredi"],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 30)},
{"start": datetime.time(14, 0), "stop": datetime.time(18, 0)},
],
},
{
"days": ["samedi"],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 0)},
],
},
{
"days": ["samedi"],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(18, 0)},
],
},
{
"days": ["dimanche"],
"hours_periods": [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 0)},
],
},
]
#
# Tests on normal opening hours
#
normal_opening_hours = [
"lundi-mardi jeudi 9h30-12h30 14h-16h30",
"mercredi vendredi 9h30-12h30 14h-17h",
"samedi",
]
normally_opened_datetime = datetime.datetime(2024, 3, 1, 10, 15)
normally_opened_all_day_datetime = datetime.datetime(2024, 4, 6, 10, 15)
normally_closed_datetime = datetime.datetime(2017, 3, 1, 20, 15)
normally_closed_all_day_datetime = datetime.datetime(2024, 4, 7, 20, 15)
def test_its_normally_open():
assert opening_hours.its_normally_open(normal_opening_hours, when=normally_opened_datetime)
def test_its_normally_open_all_day():
assert opening_hours.its_normally_open(
normal_opening_hours, when=normally_opened_all_day_datetime
)
def test_its_normally_closed():
assert not opening_hours.its_normally_open(normal_opening_hours, when=normally_closed_datetime)
def test_its_normally_closed_all_day():
assert not opening_hours.its_normally_open(
normal_opening_hours, when=normally_closed_all_day_datetime
)
def test_its_normally_open_ignore_time():
assert opening_hours.its_normally_open(
normal_opening_hours, when=normally_closed_datetime.date(), ignore_time=True
)
def test_its_normally_closed_ignore_time():
assert not opening_hours.its_normally_open(
normal_opening_hours, when=normally_closed_all_day_datetime.date(), ignore_time=True
)
#
# Tests on non working days
#
nonworking_public_holidays = [
"1janvier",
"paques",
"lundi_paques",
"8mai",
"jeudi_ascension",
"lundi_pentecote",
"14juillet",
"15aout",
"1novembre",
"11novembre",
"noel",
]
nonworking_date = datetime.date(2017, 1, 1)
not_included_nonworking_date = datetime.date(2017, 5, 1)
not_nonworking_date = datetime.date(2017, 5, 2)
def test_its_nonworking_day():
assert (
opening_hours.its_nonworking_day(nonworking_public_holidays, date=nonworking_date) is True
)
def test_its_not_nonworking_day():
assert (
opening_hours.its_nonworking_day(
nonworking_public_holidays,
date=not_nonworking_date,
)
is False
)
def test_its_not_included_nonworking_day():
assert (
opening_hours.its_nonworking_day(
nonworking_public_holidays,
date=not_included_nonworking_date,
)
is False
)
#
# Tests in exceptional closures
#
exceptional_closures = [
"22/09/2017",
"20/09/2017-22/09/2017",
"20/09/2017-22/09/2017 18/09/2017",
"25/11/2017",
"26/11/2017 9h30-12h30",
"27/11/2017 17h-18h 9h30-12h30",
]
exceptional_closure_all_day_date = datetime.date(2017, 9, 22)
exceptional_closure_all_day_datetime = datetime.datetime.combine(
exceptional_closure_all_day_date, datetime.time(20, 15)
)
exceptional_closure_datetime = datetime.datetime(2017, 11, 26, 10, 30)
exceptional_closure_datetime_hours_period = {
"start": datetime.time(9, 30),
"stop": datetime.time(12, 30),
}
not_exceptional_closure_date = datetime.date(2019, 9, 22)
def test_its_exceptionally_closed():
assert (
opening_hours.its_exceptionally_closed(
exceptional_closures, when=exceptional_closure_all_day_datetime
)
is True
)
def test_its_not_exceptionally_closed():
assert (
opening_hours.its_exceptionally_closed(
exceptional_closures, when=not_exceptional_closure_date
)
is False
)
def test_its_exceptionally_closed_all_day():
assert (
opening_hours.its_exceptionally_closed(
exceptional_closures, when=exceptional_closure_all_day_datetime, all_day=True
)
is True
)
def test_its_not_exceptionally_closed_all_day():
assert (
opening_hours.its_exceptionally_closed(
exceptional_closures, when=exceptional_closure_datetime, all_day=True
)
is False
)
def test_get_exceptional_closures_hours():
assert opening_hours.get_exceptional_closures_hours(
exceptional_closures, date=exceptional_closure_datetime.date()
) == [exceptional_closure_datetime_hours_period]
def test_get_exceptional_closures_hours_all_day():
assert opening_hours.get_exceptional_closures_hours(
exceptional_closures, date=exceptional_closure_all_day_date
) == [{"start": datetime.datetime.min.time(), "stop": datetime.datetime.max.time()}]
def test_get_exceptional_closures_hours_is_sorted():
assert opening_hours.get_exceptional_closures_hours(
["27/11/2017 17h-18h 9h30-12h30"], date=datetime.date(2017, 11, 27)
) == [
{"start": datetime.time(9, 30), "stop": datetime.time(12, 30)},
{"start": datetime.time(17, 0), "stop": datetime.time(18, 0)},
]
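The `get_exceptional_closures_hours()` assertions above all trade in `{"start": time, "stop": time}` period dicts. A hypothetical helper (not part of `mylib.opening_hours`) showing how membership in such a period list can be checked:

```python
import datetime

def time_in_periods(when, periods):
    """Return True when `when` (a datetime.time) falls inside any
    {"start": time, "stop": time} period of the given list."""
    return any(p["start"] <= when <= p["stop"] for p in periods)

# Periods in the shape returned by get_exceptional_closures_hours()
periods = [
    {"start": datetime.time(9, 30), "stop": datetime.time(12, 30)},
    {"start": datetime.time(17, 0), "stop": datetime.time(18, 0)},
]
```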
#
# Tests on is_closed
#
exceptional_closures = [
    "22/09/2017",
    "20/09/2017-22/09/2017",
    "20/09/2017-22/09/2017 18/09/2017",
    "25/11/2017",
    "26/11/2017 9h30-12h30",
]
normal_opening_hours = [
    "lundi-mardi jeudi 9h30-12h30 14h-16h30",
    "mercredi vendredi 9h30-12h30 14h-17h",
]
nonworking_public_holidays = [
    "1janvier",
    "paques",
    "lundi_paques",
    "1mai",
    "8mai",
    "jeudi_ascension",
    "lundi_pentecote",
    "14juillet",
    "15aout",
    "1novembre",
    "11novembre",
    "noel",
]
def test_is_closed_when_normaly_closed_by_hour():
assert opening_hours.is_closed(
normal_opening_hours_values=normal_opening_hours,
exceptional_closures_values=exceptional_closures,
nonworking_public_holidays_values=nonworking_public_holidays,
        when=datetime.datetime(2017, 5, 1, 20, 15),
    ) == {"closed": True, "exceptional_closure": False, "exceptional_closure_all_day": False}
def test_is_closed_on_exceptional_closure_full_day():
    assert opening_hours.is_closed(
        normal_opening_hours_values=normal_opening_hours,
        exceptional_closures_values=exceptional_closures,
        nonworking_public_holidays_values=nonworking_public_holidays,
        when=datetime.datetime(2017, 9, 22, 14, 15),
    ) == {"closed": True, "exceptional_closure": True, "exceptional_closure_all_day": True}
def test_is_closed_on_exceptional_closure_day():
    assert opening_hours.is_closed(
        normal_opening_hours_values=normal_opening_hours,
        exceptional_closures_values=exceptional_closures,
        nonworking_public_holidays_values=nonworking_public_holidays,
        when=datetime.datetime(2017, 11, 26, 10, 30),
    ) == {"closed": True, "exceptional_closure": True, "exceptional_closure_all_day": False}
def test_is_closed_on_nonworking_public_holidays():
    assert opening_hours.is_closed(
        normal_opening_hours_values=normal_opening_hours,
        exceptional_closures_values=exceptional_closures,
        nonworking_public_holidays_values=nonworking_public_holidays,
        when=datetime.datetime(2017, 1, 1, 10, 30),
    ) == {"closed": True, "exceptional_closure": False, "exceptional_closure_all_day": False}
def test_is_closed_when_normaly_closed_by_day():
    assert opening_hours.is_closed(
        normal_opening_hours_values=normal_opening_hours,
        exceptional_closures_values=exceptional_closures,
        nonworking_public_holidays_values=nonworking_public_holidays,
        when=datetime.datetime(2017, 5, 7, 14, 15),
    ) == {"closed": True, "exceptional_closure": False, "exceptional_closure_all_day": False}
def test_is_closed_when_normaly_opened():
    assert opening_hours.is_closed(
        normal_opening_hours_values=normal_opening_hours,
        exceptional_closures_values=exceptional_closures,
        nonworking_public_holidays_values=nonworking_public_holidays,
        when=datetime.datetime(2017, 5, 2, 15, 15),
    ) == {"closed": False, "exceptional_closure": False, "exceptional_closure_all_day": False}
def test_easter_date():
def test_nonworking_french_public_days_of_the_year():
assert opening_hours.nonworking_french_public_days_of_the_year(2021) == {
"1janvier": datetime.date(2021, 1, 1),
"paques": datetime.date(2021, 4, 4),
"lundi_paques": datetime.date(2021, 4, 5),
"1mai": datetime.date(2021, 5, 1),
"8mai": datetime.date(2021, 5, 8),
"jeudi_ascension": datetime.date(2021, 5, 13),
"pentecote": datetime.date(2021, 5, 23),
"lundi_pentecote": datetime.date(2021, 5, 24),
"14juillet": datetime.date(2021, 7, 14),
"15aout": datetime.date(2021, 8, 15),
"1novembre": datetime.date(2021, 11, 1),
"11novembre": datetime.date(2021, 11, 11),
"noel": datetime.date(2021, 12, 25),
"saint_etienne": datetime.date(2021, 12, 26),
}
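The expected `paques` value above (and the `test_easter_date` body elided from this hunk) depend on the Gregorian computus. A standalone sketch using the Anonymous Gregorian (Butcher/Meeus) algorithm, independent of the module's own implementation:

```python
import datetime

def easter_date(year):
    """Gregorian Easter Sunday via the Anonymous Gregorian algorithm."""
    a = year % 19
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return datetime.date(year, month, day + 1)

# Matches the "paques" entry asserted above
assert easter_date(2021) == datetime.date(2021, 4, 4)
```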
def test_next_opening_date():
assert opening_hours.next_opening_date(
normal_opening_hours_values=normal_opening_hours,
exceptional_closures_values=exceptional_closures,
nonworking_public_holidays_values=nonworking_public_holidays,
date=datetime.date(2021, 4, 4),
) == datetime.date(2021, 4, 6)
def test_next_opening_hour():
assert opening_hours.next_opening_hour(
normal_opening_hours_values=normal_opening_hours,
exceptional_closures_values=exceptional_closures,
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 4, 10, 30),
) == datetime.datetime(2021, 4, 6, 9, 30)
def test_next_opening_hour_with_exceptionnal_closure_hours():
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=["06/04/2021 9h-13h 14h-16h"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 4, 10, 30),
) == datetime.datetime(2021, 4, 6, 16, 0)
def test_next_opening_hour_with_exceptionnal_closure_day():
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=["06/04/2021"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 4, 10, 30),
) == datetime.datetime(2021, 4, 7, 9, 0)
def test_next_opening_hour_with_overlapsed_opening_hours():
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h", "mardi 8h-19h"],
exceptional_closures_values=["06/04/2021 9h-13h 14h-16h"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 4, 10, 30),
) == datetime.datetime(2021, 4, 6, 8, 0)
def test_next_opening_hour_with_too_large_exceptionnal_closure_days():
assert (
opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
        exceptional_closures_values=["06/04/2021-16/04/2021"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 4, 10, 30),
max_anaylse_days=10,
)
is False
)
def test_next_opening_hour_on_opened_moment():
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 6, 10, 30),
) == datetime.datetime(2021, 4, 6, 10, 30)
def test_next_opening_hour_on_same_day():
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 6, 13, 0),
) == datetime.datetime(2021, 4, 6, 14, 0)
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 6, 16, 0),
) == datetime.datetime(2021, 4, 6, 16, 0)
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 6, 16, 0),
) == datetime.datetime(2021, 4, 6, 16, 0)
def test_next_opening_hour_on_opened_day_but_too_late():
assert opening_hours.next_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2021, 4, 6, 23, 0),
) == datetime.datetime(2021, 4, 7, 9, 0)
def test_previous_opening_date():
assert opening_hours.previous_opening_date(
normal_opening_hours_values=["lundi-vendredi 9h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
date=datetime.date(2024, 4, 1),
) == datetime.date(2024, 3, 29)
def test_previous_opening_hour():
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 1, 10, 30),
) == datetime.datetime(2024, 3, 29, 18, 0)
def test_previous_opening_hour_with_exceptionnal_closure_hours():
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=["29/03/2024 14h-18h"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 1, 10, 30),
) == datetime.datetime(2024, 3, 29, 12, 0)
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=["29/03/2024 16h-18h"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 1, 10, 30),
) == datetime.datetime(2024, 3, 29, 16, 0)
def test_previous_opening_hour_with_exceptionnal_closure_day():
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=["29/03/2024"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 1, 10, 30),
) == datetime.datetime(2024, 3, 28, 18, 0)
def test_previous_opening_hour_with_overlapsed_opening_hours():
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h", "mardi 8h-19h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 3, 8, 30),
) == datetime.datetime(2024, 4, 2, 19, 0)
def test_previous_opening_hour_with_too_large_exceptionnal_closure_days():
assert (
opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
            exceptional_closures_values=["06/03/2024-16/04/2024"],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 17, 8, 30),
max_anaylse_days=10,
)
is False
)
def test_previous_opening_hour_on_opened_moment():
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 5, 10, 30),
) == datetime.datetime(2024, 4, 5, 10, 30)
def test_previous_opening_hour_on_same_day():
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 5, 13, 0),
) == datetime.datetime(2024, 4, 5, 12, 0)
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 5, 16, 0),
) == datetime.datetime(2024, 4, 5, 16, 0)
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 5, 16, 0),
) == datetime.datetime(2024, 4, 5, 16, 0)
def test_previous_opening_hour_on_opened_day_but_too_early():
assert opening_hours.previous_opening_hour(
normal_opening_hours_values=["lundi-vendredi 9h-12h 14h-18h"],
exceptional_closures_values=[],
nonworking_public_holidays_values=nonworking_public_holidays,
when=datetime.datetime(2024, 4, 5, 8, 0),
) == datetime.datetime(2024, 4, 4, 18, 0)
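The `max_anaylse_days` bound exercised in the too-large-closure tests above implies a day-by-day scan that eventually gives up. A simplified sketch of that search loop, inferred from the tests rather than copied from the module (`is_open_day` is a hypothetical predicate):

```python
import datetime

def next_open_date(is_open_day, start, max_analyse_days=30):
    """Scan forward one day at a time; return the first date for which
    is_open_day(date) is True, or False when the bound is exhausted."""
    for offset in range(max_analyse_days + 1):
        day = start + datetime.timedelta(days=offset)
        if is_open_day(day):
            return day
    return False
```

With a predicate that is never true, the scan stops after `max_analyse_days` days and returns `False`, which is the behavior the too-large-closure tests assert.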

# pylint: disable=redefined-outer-name,missing-function-docstring,protected-access
"""Tests on the OracleDB helper"""
import cx_Oracle
import pytest
from mylib.oracle import OracleDB
class FakeCXOracleCursor:
"""Fake cx_Oracle cursor"""
def __init__(
self, expected_sql, expected_params, expected_return, expected_just_try, expected_exception
):
self.expected_sql = expected_sql
self.expected_params = expected_params
self.expected_return = expected_return
self.expected_just_try = expected_just_try
self.expected_exception = expected_exception
self.opened = True
def execute(self, sql, **params):
assert self.opened
if self.expected_exception:
raise cx_Oracle.Error(f"{self}.execute({sql}, {params}): expected exception")
if self.expected_just_try and not sql.lower().startswith("select "):
assert False, f"{self}.execute({sql}, {params}) may not be executed in just try mode"
# pylint: disable=consider-using-f-string
assert (
sql == self.expected_sql
), "%s.execute(): Invalid SQL query:\n '%s'\nMay be:\n '%s'" % (
self,
sql,
self.expected_sql,
)
# pylint: disable=consider-using-f-string
assert (
params == self.expected_params
), "%s.execute(): Invalid params:\n %s\nMay be:\n %s" % (
self,
params,
self.expected_params,
)
return self.expected_return
def fetchall(self):
assert self.opened
return self.expected_return
def __enter__(self):
self.opened = True
return self
def __exit__(self, *args):
self.opened = False
def __repr__(self):
return (
f"FakeCXOracleCursor({self.expected_sql}, {self.expected_params}, "
f"{self.expected_return}, {self.expected_just_try})"
)
class FakeCXOracle:
"""Fake cx_Oracle connection"""
expected_sql = None
expected_params = {}
expected_return = True
expected_just_try = False
expected_exception = False
just_try = False
def __init__(self, **kwargs):
allowed_kwargs = {"dsn": str, "user": str, "password": (str, None)}
for arg, value in kwargs.items():
assert arg in allowed_kwargs, f"Invalid arg {arg}='{value}'"
assert isinstance(
value, allowed_kwargs[arg]
), f"Arg {arg} not a {allowed_kwargs[arg]} ({type(value)})"
setattr(self, arg, value)
def close(self):
def cursor(self):
return FakeCXOracleCursor(
self.expected_sql,
self.expected_params,
self.expected_return,
self.expected_just_try or self.just_try,
self.expected_exception,
)
def commit(self):
@pytest.fixture
def test_oracledb():
return OracleDB("127.0.0.1/dbname", "user", "password")
@pytest.fixture
def fake_oracledb(mocker):
mocker.patch("cx_Oracle.connect", fake_cxoracle_connect)
return OracleDB("127.0.0.1/dbname", "user", "password")
@pytest.fixture
def fake_just_try_oracledb(mocker):
mocker.patch("cx_Oracle.connect", fake_cxoracle_connect_just_try)
return OracleDB("127.0.0.1/dbname", "user", "password", just_try=True)
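These fixtures use pytest-mock's `mocker.patch` to swap `cx_Oracle.connect` for a fake before `OracleDB` connects. The same substitution can be sketched with the standard library's `unittest.mock.patch.object` (a generic illustration with stand-in classes, not the project's code):

```python
import unittest.mock

class Driver:
    """Stand-in for the cx_Oracle module (illustration only)."""

    @staticmethod
    def connect(dsn=None, user=None, password=None):
        raise RuntimeError("real driver must not be reached in tests")

class FakeConnection:
    """Records the connection kwargs instead of opening anything."""

    def __init__(self, **kwargs):
        self.kwargs = kwargs

# mocker.patch("cx_Oracle.connect", fake_cxoracle_connect) performs the
# same substitution as this stdlib context manager: the attribute is
# replaced for the duration of the block and restored afterwards.
with unittest.mock.patch.object(Driver, "connect", FakeConnection):
    conn = Driver.connect(dsn="127.0.0.1/dbname", user="user", password="password")
    assert conn.kwargs["dsn"] == "127.0.0.1/dbname"
```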
@pytest.fixture
def fake_connected_just_try_oracledb(fake_just_try_oracledb):
return fake_just_try_oracledb
def generate_mock_args(
expected_args=(), expected_kwargs={}, expected_return=True
): # pylint: disable=dangerous-default-value
def mock_args(*args, **kwargs):
# pylint: disable=consider-using-f-string
assert args == expected_args, "Invalid call args:\n %s\nMay be:\n %s" % (
args,
expected_args,
)
# pylint: disable=consider-using-f-string
assert kwargs == expected_kwargs, "Invalid call kwargs:\n %s\nMay be:\n %s" % (
kwargs,
expected_kwargs,
)
return expected_return
return mock_args
def mock_doSQL_just_try(self, sql, params=None):  # pylint: disable=unused-argument
assert False, "doSQL() may not be executed in just try mode"
def generate_mock_doSQL(
expected_sql, expected_params={}, expected_return=True
): # pylint: disable=dangerous-default-value
def mock_doSQL(self, sql, params=None): # pylint: disable=unused-argument
# pylint: disable=consider-using-f-string
assert sql == expected_sql, "Invalid generated SQL query:\n '%s'\nMay be:\n '%s'" % (
sql,
expected_sql,
)
# pylint: disable=consider-using-f-string
assert params == expected_params, "Invalid generated params:\n %s\nMay be:\n %s" % (
params,
expected_params,
)
return expected_return
return mock_doSQL
mock_doSelect_just_try = mock_doSQL_just_try
def test_combine_params_with_to_add_parameter():
assert OracleDB._combine_params({"test1": 1}, {"test2": 2}) == {"test1": 1, "test2": 2}
def test_combine_params_with_kargs():
assert OracleDB._combine_params({"test1": 1}, test2=2) == {"test1": 1, "test2": 2}
def test_combine_params_with_kargs_and_to_add_parameter():
assert OracleDB._combine_params({"test1": 1}, {"test2": 2}, test3=3) == {
"test1": 1,
"test2": 2,
"test3": 3,
}
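The three assertions above pin down the `_combine_params()` contract. A behavior-equivalent sketch inferred from these tests (not the actual `mylib.oracle` implementation):

```python
def combine_params(params, to_add=None, **kwargs):
    """Merge a base params dict with an optional extra dict and keyword
    arguments; later sources win on key collisions."""
    combined = dict(params)
    if to_add:
        combined.update(to_add)
    combined.update(kwargs)
    return combined
```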
def test_format_where_clauses_params_are_preserved():
args = ("test = test", {"test1": 1})
assert OracleDB._format_where_clauses(*args) == args
def test_format_where_clauses_raw():
assert OracleDB._format_where_clauses("test = test") == ("test = test", {})
def test_format_where_clauses_tuple_clause_with_params():
where_clauses = ("test1 = :test1 AND test2 = :test2", {"test1": 1, "test2": 2})
assert OracleDB._format_where_clauses(where_clauses) == where_clauses
def test_format_where_clauses_dict():
where_clauses = {"test1": 1, "test2": 2}
assert OracleDB._format_where_clauses(where_clauses) == (
'"test1" = :test1 AND "test2" = :test2',
where_clauses,
)
def test_format_where_clauses_combined_types():
where_clauses = ("test1 = 1", ("test2 LIKE :test2", {"test2": 2}), {"test3": 3, "test4": 4})
assert OracleDB._format_where_clauses(where_clauses) == (
'test1 = 1 AND test2 LIKE :test2 AND "test3" = :test3 AND "test4" = :test4',
{"test2": 2, "test3": 3, "test4": 4},
)
def test_format_where_clauses_with_where_op():
where_clauses = {"test1": 1, "test2": 2}
assert OracleDB._format_where_clauses(where_clauses, where_op="OR") == (
'"test1" = :test1 OR "test2" = :test2',
where_clauses,
)
def test_add_where_clauses():
sql = "SELECT * FROM table"
where_clauses = {"test1": 1, "test2": 2}
assert OracleDB._add_where_clauses(sql, None, where_clauses) == (
sql + ' WHERE "test1" = :test1 AND "test2" = :test2',
where_clauses,
)
def test_add_where_clauses_preserved_params():
sql = "SELECT * FROM table"
where_clauses = {"test1": 1, "test2": 2}
params = {"fake1": 1}
assert OracleDB._add_where_clauses(sql, params.copy(), where_clauses) == (
sql + ' WHERE "test1" = :test1 AND "test2" = :test2',
{**where_clauses, **params},
)
def test_add_where_clauses_with_op():
sql = "SELECT * FROM table"
where_clauses = ("test1=1", "test2=2")
assert OracleDB._add_where_clauses(sql, None, where_clauses, where_op="OR") == (
sql + " WHERE test1=1 OR test2=2",
{},
)
def test_add_where_clauses_with_duplicated_field():
sql = "UPDATE table SET test1=:test1"
params = {"test1": "new_value"}
where_clauses = {"test1": "where_value"}
assert OracleDB._add_where_clauses(sql, params, where_clauses) == (
sql + ' WHERE "test1" = :test1_1',
{"test1": "new_value", "test1_1": "where_value"},
)
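The duplicated-field assertion above shows the interesting part of `_add_where_clauses()`: a WHERE parameter that collides with an UPDATE parameter is renamed (`test1` becomes `test1_1`). A hypothetical reimplementation of just the dict-formatting step, based purely on the expected values in these tests:

```python
def format_dict_where(where_clauses, params=None, where_op="AND"):
    """Build '"field" = :placeholder' pairs from a dict of WHERE values,
    renaming a placeholder when it would collide with an existing query
    parameter (test1 -> test1_1)."""
    params = dict(params or {})
    clauses = []
    for field, value in where_clauses.items():
        placeholder = field
        idx = 1
        while placeholder in params:
            placeholder = f"{field}_{idx}"
            idx += 1
        params[placeholder] = value
        clauses.append(f'"{field}" = :{placeholder}')
    return (f" {where_op} ".join(clauses), params)
```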
def test_insert(mocker, test_oracledb):
values = {"test1": 1, "test2": 2}
mocker.patch(
"mylib.oracle.OracleDB.doSQL",
generate_mock_doSQL(
'INSERT INTO "mytable" ("test1", "test2") VALUES (:test1, :test2)', values
),
)
assert test_oracledb.insert("mytable", values)
def test_insert_just_try(mocker, test_oracledb):
mocker.patch("mylib.oracle.OracleDB.doSQL", mock_doSQL_just_try)
assert test_oracledb.insert("mytable", {"test1": 1, "test2": 2}, just_try=True)
def test_update(mocker, test_oracledb):
values = {"test1": 1, "test2": 2}
where_clauses = {"test3": 3, "test4": 4}
mocker.patch(
"mylib.oracle.OracleDB.doSQL",
generate_mock_doSQL(
'UPDATE "mytable" SET "test1" = :test1, "test2" = :test2 WHERE "test3" = :test3 AND'
' "test4" = :test4',
{**values, **where_clauses},
),
)
assert test_oracledb.update("mytable", values, where_clauses)
def test_update_just_try(mocker, test_oracledb):
mocker.patch("mylib.oracle.OracleDB.doSQL", mock_doSQL_just_try)
assert test_oracledb.update("mytable", {"test1": 1, "test2": 2}, None, just_try=True)
def test_delete(mocker, test_oracledb):
where_clauses = {"test1": 1, "test2": 2}
mocker.patch(
"mylib.oracle.OracleDB.doSQL",
generate_mock_doSQL(
'DELETE FROM "mytable" WHERE "test1" = :test1 AND "test2" = :test2', where_clauses
),
)
assert test_oracledb.delete("mytable", where_clauses)
def test_delete_just_try(mocker, test_oracledb):
mocker.patch("mylib.oracle.OracleDB.doSQL", mock_doSQL_just_try)
assert test_oracledb.delete("mytable", None, just_try=True)
def test_truncate(mocker, test_oracledb):
mocker.patch(
"mylib.oracle.OracleDB.doSQL", generate_mock_doSQL('TRUNCATE TABLE "mytable"', None)
)
assert test_oracledb.truncate("mytable")
def test_truncate_just_try(mocker, test_oracledb):
mocker.patch("mylib.oracle.OracleDB.doSQL", mock_doSelect_just_try)
assert test_oracledb.truncate("mytable", just_try=True)
def test_select(mocker, test_oracledb):
fields = ("field1", "field2")
where_clauses = {"test3": 3, "test4": 4}
expected_return = [
{"field1": 1, "field2": 2},
{"field1": 2, "field2": 3},
]
order_by = "field1, DESC"
limit = 10
mocker.patch(
"mylib.oracle.OracleDB.doSelect",
generate_mock_doSQL(
'SELECT "field1", "field2" FROM "mytable" WHERE "test3" = :test3 AND "test4" = :test4'
" ORDER BY " + order_by + " LIMIT " + str(limit), # nosec: B608
where_clauses,
expected_return,
),
)
assert (
test_oracledb.select("mytable", where_clauses, fields, order_by=order_by, limit=limit)
== expected_return
)
def test_select_without_field_and_order_by(mocker, test_oracledb):
mocker.patch("mylib.oracle.OracleDB.doSelect", generate_mock_doSQL('SELECT * FROM "mytable"'))
assert test_oracledb.select("mytable")
def test_select_just_try(mocker, test_oracledb):
mocker.patch("mylib.oracle.OracleDB.doSQL", mock_doSelect_just_try)
assert test_oracledb.select("mytable", None, None, just_try=True)
#
# Tests on main methods
def test_connect(mocker, test_oracledb):
expected_kwargs = {
"dsn": test_oracledb._dsn,
"user": test_oracledb._user,
"password": test_oracledb._pwd,
}
mocker.patch("cx_Oracle.connect", generate_mock_args(expected_kwargs=expected_kwargs))
assert test_oracledb.connect()
def test_doSQL(fake_connected_oracledb):
fake_connected_oracledb._conn.expected_sql = "DELETE FROM table WHERE test1 = :test1"
fake_connected_oracledb._conn.expected_params = {"test1": 1}
fake_connected_oracledb.doSQL(
fake_connected_oracledb._conn.expected_sql, fake_connected_oracledb._conn.expected_params
)
def test_doSQL_without_params(fake_connected_oracledb):
fake_connected_oracledb._conn.expected_sql = "DELETE FROM table"
fake_connected_oracledb.doSQL(fake_connected_oracledb._conn.expected_sql)
def test_doSQL_just_try(fake_connected_just_try_oracledb):
assert fake_connected_just_try_oracledb.doSQL("DELETE FROM table")
def test_doSQL_on_exception(fake_connected_oracledb):
fake_connected_oracledb._conn.expected_exception = True
assert fake_connected_oracledb.doSQL("DELETE FROM table") is False
def test_doSelect(fake_connected_oracledb):
fake_connected_oracledb._conn.expected_sql = 'SELECT * FROM table WHERE test1 = :test1'
fake_connected_oracledb._conn.expected_params = dict(test1=1)
fake_connected_oracledb._conn.expected_return = [dict(test1=1)]
assert fake_connected_oracledb.doSelect(
fake_connected_oracledb._conn.expected_sql,
fake_connected_oracledb._conn.expected_params) == fake_connected_oracledb._conn.expected_return
fake_connected_oracledb._conn.expected_sql = "SELECT * FROM table WHERE test1 = :test1"
fake_connected_oracledb._conn.expected_params = {"test1": 1}
fake_connected_oracledb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_oracledb.doSelect(
fake_connected_oracledb._conn.expected_sql,
fake_connected_oracledb._conn.expected_params,
)
== fake_connected_oracledb._conn.expected_return
)
def test_doSelect_without_params(fake_connected_oracledb):
fake_connected_oracledb._conn.expected_sql = 'SELECT * FROM table'
fake_connected_oracledb._conn.expected_return = [dict(test1=1)]
assert fake_connected_oracledb.doSelect(fake_connected_oracledb._conn.expected_sql) == fake_connected_oracledb._conn.expected_return
fake_connected_oracledb._conn.expected_sql = "SELECT * FROM table"
fake_connected_oracledb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_oracledb.doSelect(fake_connected_oracledb._conn.expected_sql)
== fake_connected_oracledb._conn.expected_return
)
def test_doSelect_on_exception(fake_connected_oracledb):
fake_connected_oracledb._conn.expected_exception = True
assert fake_connected_oracledb.doSelect('SELECT * FROM table') is False
assert fake_connected_oracledb.doSelect("SELECT * FROM table") is False
def test_doSelect_just_try(fake_connected_just_try_oracledb):
fake_connected_just_try_oracledb._conn.expected_sql = 'SELECT * FROM table WHERE test1 = :test1'
fake_connected_just_try_oracledb._conn.expected_params = dict(test1=1)
fake_connected_just_try_oracledb._conn.expected_return = [dict(test1=1)]
assert fake_connected_just_try_oracledb.doSelect(
fake_connected_just_try_oracledb._conn.expected_sql,
fake_connected_just_try_oracledb._conn.expected_params
) == fake_connected_just_try_oracledb._conn.expected_return
fake_connected_just_try_oracledb._conn.expected_sql = "SELECT * FROM table WHERE test1 = :test1"
fake_connected_just_try_oracledb._conn.expected_params = {"test1": 1}
fake_connected_just_try_oracledb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_just_try_oracledb.doSelect(
fake_connected_just_try_oracledb._conn.expected_sql,
fake_connected_just_try_oracledb._conn.expected_params,
)
== fake_connected_just_try_oracledb._conn.expected_return
)


@@ -1,15 +1,19 @@
# pylint: disable=redefined-outer-name,missing-function-docstring,protected-access
""" Tests on opening hours helpers """
import psycopg2
import pytest
from psycopg2.extras import RealDictCursor
from mylib.pgsql import PgDB
class FakePsycopg2Cursor:
""" Fake Psycopg2 cursor """
"""Fake Psycopg2 cursor"""
def __init__(self, expected_sql, expected_params, expected_return, expected_just_try, expected_exception):
def __init__(
self, expected_sql, expected_params, expected_return, expected_just_try, expected_exception
):
self.expected_sql = expected_sql
self.expected_params = expected_params
self.expected_return = expected_return
@@ -18,38 +22,55 @@ class FakePsycopg2Cursor:
def execute(self, sql, params=None):
if self.expected_exception:
raise Exception("%s.execute(%s, %s): expected exception" % (self, sql, params))
if self.expected_just_try and not sql.lower().startswith('select '):
assert False, "%s.execute(%s, %s) may not be executed in just try mode" % (self, sql, params)
assert sql == self.expected_sql, "%s.execute(): Invalid SQL query:\n '%s'\nMay be:\n '%s'" % (self, sql, self.expected_sql)
assert params == self.expected_params, "%s.execute(): Invalid params:\n %s\nMay be:\n %s" % (self, params, self.expected_params)
raise psycopg2.Error(f"{self}.execute({sql}, {params}): expected exception")
if self.expected_just_try and not sql.lower().startswith("select "):
assert False, f"{self}.execute({sql}, {params}) may not be executed in just try mode"
# pylint: disable=consider-using-f-string
assert (
sql == self.expected_sql
), "%s.execute(): Invalid SQL query:\n '%s'\nMay be:\n '%s'" % (
self,
sql,
self.expected_sql,
)
# pylint: disable=consider-using-f-string
assert (
params == self.expected_params
), "%s.execute(): Invalid params:\n %s\nMay be:\n %s" % (
self,
params,
self.expected_params,
)
return self.expected_return
def fetchall(self):
return self.expected_return
def __repr__(self):
return "FakePsycopg2Cursor(%s, %s, %s, %s)" % (
self.expected_sql, self.expected_params,
self.expected_return, self.expected_just_try
return (
f"FakePsycopg2Cursor({self.expected_sql}, {self.expected_params}, "
f"{self.expected_return}, {self.expected_just_try})"
)
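The fake-cursor pattern used by FakePsycopg2Cursor (and its Oracle counterpart earlier in this diff) can be sketched in isolation: the cursor is primed with the SQL and params it expects, asserts that execute() receives exactly those values, and returns canned rows from fetchall(). This is a hedged re-creation with illustrative names, not the project's class:

```python
class FakeCursor:
    """Cursor double that verifies calls and serves canned rows."""

    def __init__(self, expected_sql, expected_params, rows):
        self.expected_sql = expected_sql
        self.expected_params = expected_params
        self.rows = rows

    def execute(self, sql, params=None):
        # Fail the test immediately if the code under test built the
        # wrong query or bound the wrong parameters.
        assert sql == self.expected_sql, f"unexpected SQL: {sql!r}"
        assert params == self.expected_params, f"unexpected params: {params!r}"

    def fetchall(self):
        return self.rows


cur = FakeCursor("SELECT * FROM t WHERE a = %(a)s", {"a": 1}, [{"a": 1}])
cur.execute("SELECT * FROM t WHERE a = %(a)s", {"a": 1})
assert cur.fetchall() == [{"a": 1}]
```

Because the assertions live inside the double, any query-building regression surfaces as a plain test failure with the mismatching SQL in the message.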
class FakePsycopg2:
""" Fake Psycopg2 connection """
"""Fake Psycopg2 connection"""
expected_sql = None
expected_params = None
expected_cursor_factory = None
expected_return = True
expected_just_try = False
expected_exception = False
just_try = False
def __init__(self, **kwargs):
allowed_kwargs = dict(dbname=str, user=str, password=(str, None), host=str)
allowed_kwargs = {"dbname": str, "user": str, "password": (str, None), "host": str}
for arg, value in kwargs.items():
assert arg in allowed_kwargs, "Invalid arg %s='%s'" % (arg, value)
assert isinstance(value, allowed_kwargs[arg]), "Arg %s not a %s (%s)" % (arg, allowed_kwargs[arg], type(value))
assert arg in allowed_kwargs, f'Invalid arg {arg}="{value}"'
assert isinstance(
value, allowed_kwargs[arg]
), f"Arg {arg} not a {allowed_kwargs[arg]} ({type(value)})"
setattr(self, arg, value)
def close(self):
@@ -59,14 +80,17 @@ class FakePsycopg2:
self._check_just_try()
assert len(arg) == 1 and isinstance(arg[0], str)
if self.expected_exception:
raise Exception("set_client_encoding(%s): Expected exception" % arg[0])
raise psycopg2.Error(f"set_client_encoding({arg[0]}): Expected exception")
return self.expected_return
def cursor(self):
def cursor(self, cursor_factory=None):
assert cursor_factory is self.expected_cursor_factory
return FakePsycopg2Cursor(
self.expected_sql, self.expected_params,
self.expected_return, self.expected_just_try or self.just_try,
self.expected_exception
self.expected_sql,
self.expected_params,
self.expected_return,
self.expected_just_try or self.just_try,
self.expected_exception,
)
def commit(self):
@@ -94,19 +118,19 @@ def fake_psycopg2_connect_just_try(**kwargs):
@pytest.fixture
def test_pgdb():
return PgDB('127.0.0.1', 'user', 'password', 'dbname')
return PgDB("127.0.0.1", "user", "password", "dbname")
@pytest.fixture
def fake_pgdb(mocker):
mocker.patch('psycopg2.connect', fake_psycopg2_connect)
return PgDB('127.0.0.1', 'user', 'password', 'dbname')
mocker.patch("psycopg2.connect", fake_psycopg2_connect)
return PgDB("127.0.0.1", "user", "password", "dbname")
@pytest.fixture
def fake_just_try_pgdb(mocker):
mocker.patch('psycopg2.connect', fake_psycopg2_connect_just_try)
return PgDB('127.0.0.1', 'user', 'password', 'dbname', just_try=True)
mocker.patch("psycopg2.connect", fake_psycopg2_connect_just_try)
return PgDB("127.0.0.1", "user", "password", "dbname", just_try=True)
@pytest.fixture
@@ -121,11 +145,22 @@ def fake_connected_just_try_pgdb(fake_just_try_pgdb):
return fake_just_try_pgdb
def generate_mock_args(expected_args=(), expected_kwargs=dict(), expected_return=True): # pylint: disable=dangerous-default-value
def generate_mock_args(
expected_args=(), expected_kwargs={}, expected_return=True
): # pylint: disable=dangerous-default-value
def mock_args(*args, **kwargs):
assert args == expected_args, "Invalid call args:\n %s\nMay be:\n %s" % (args, expected_args)
assert kwargs == expected_kwargs, "Invalid call kwargs:\n %s\nMay be:\n %s" % (kwargs, expected_kwargs)
# pylint: disable=consider-using-f-string
assert args == expected_args, "Invalid call args:\n %s\nMay be:\n %s" % (
args,
expected_args,
)
# pylint: disable=consider-using-f-string
assert kwargs == expected_kwargs, "Invalid call kwargs:\n %s\nMay be:\n %s" % (
kwargs,
expected_kwargs,
)
return expected_return
return mock_args
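generate_mock_args() is a factory: it closes over the expected call signature and returns a replacement callable that asserts on how it is invoked before handing back a canned result. A minimal sketch of the same idea, with hypothetical names:

```python
def make_call_checker(expected_args=(), expected_kwargs=None, result=True):
    """Build a stand-in callable that verifies its own invocation."""
    expected_kwargs = expected_kwargs or {}

    def checker(*args, **kwargs):
        # Mismatched positional or keyword arguments fail the test here.
        assert args == expected_args, f"args {args!r} != {expected_args!r}"
        assert kwargs == expected_kwargs, f"kwargs {kwargs!r} != {expected_kwargs!r}"
        return result

    return checker


connect = make_call_checker(expected_kwargs={"host": "127.0.0.1"}, result="conn")
assert connect(host="127.0.0.1") == "conn"
```

Using `expected_kwargs=None` plus an `or {}` fallback also sidesteps the mutable-default-argument pitfall that the `# pylint: disable=dangerous-default-value` comments above acknowledge.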
@@ -133,11 +168,22 @@ def mock_doSQL_just_try(self, sql, params=None): # pylint: disable=unused-argum
assert False, "doSQL() may not be executed in just try mode"
def generate_mock_doSQL(expected_sql, expected_params=dict(), expected_return=True): # pylint: disable=dangerous-default-value
def generate_mock_doSQL(
expected_sql, expected_params={}, expected_return=True
): # pylint: disable=dangerous-default-value
def mock_doSQL(self, sql, params=None): # pylint: disable=unused-argument
assert sql == expected_sql, "Invalid generated SQL query:\n '%s'\nMay be:\n '%s'" % (sql, expected_sql)
assert params == expected_params, "Invalid generated params:\n %s\nMay be:\n %s" % (params, expected_params)
# pylint: disable=consider-using-f-string
assert sql == expected_sql, "Invalid generated SQL query:\n '%s'\nMay be:\n '%s'" % (
sql,
expected_sql,
)
# pylint: disable=consider-using-f-string
assert params == expected_params, "Invalid generated params:\n %s\nMay be:\n %s" % (
params,
expected_params,
)
return expected_return
return mock_doSQL
@@ -151,103 +197,94 @@ mock_doSelect_just_try = mock_doSQL_just_try
def test_combine_params_with_to_add_parameter():
assert PgDB._combine_params(dict(test1=1), dict(test2=2)) == dict(
test1=1, test2=2
)
assert PgDB._combine_params({"test1": 1}, {"test2": 2}) == {"test1": 1, "test2": 2}
def test_combine_params_with_kargs():
assert PgDB._combine_params(dict(test1=1), test2=2) == dict(
test1=1, test2=2
)
assert PgDB._combine_params({"test1": 1}, test2=2) == {"test1": 1, "test2": 2}
def test_combine_params_with_kargs_and_to_add_parameter():
assert PgDB._combine_params(dict(test1=1), dict(test2=2), test3=3) == dict(
test1=1, test2=2, test3=3
)
assert PgDB._combine_params({"test1": 1}, {"test2": 2}, test3=3) == {
"test1": 1,
"test2": 2,
"test3": 3,
}
def test_format_where_clauses_params_are_preserved():
args = ('test = test', dict(test1=1))
args = ("test = test", {"test1": 1})
assert PgDB._format_where_clauses(*args) == args
def test_format_where_clauses_raw():
assert PgDB._format_where_clauses('test = test') == (('test = test'), dict())
assert PgDB._format_where_clauses("test = test") == ("test = test", {})
def test_format_where_clauses_tuple_clause_with_params():
where_clauses = (
'test1 = %(test1)s AND test2 = %(test2)s',
dict(test1=1, test2=2)
)
where_clauses = ("test1 = %(test1)s AND test2 = %(test2)s", {"test1": 1, "test2": 2})
assert PgDB._format_where_clauses(where_clauses) == where_clauses
def test_format_where_clauses_dict():
where_clauses = dict(test1=1, test2=2)
where_clauses = {"test1": 1, "test2": 2}
assert PgDB._format_where_clauses(where_clauses) == (
'"test1" = %(test1)s AND "test2" = %(test2)s',
where_clauses
where_clauses,
)
def test_format_where_clauses_combined_types():
where_clauses = (
'test1 = 1',
('test2 LIKE %(test2)s', dict(test2=2)),
dict(test3=3, test4=4)
)
where_clauses = ("test1 = 1", ("test2 LIKE %(test2)s", {"test2": 2}), {"test3": 3, "test4": 4})
assert PgDB._format_where_clauses(where_clauses) == (
'test1 = 1 AND test2 LIKE %(test2)s AND "test3" = %(test3)s AND "test4" = %(test4)s',
dict(test2=2, test3=3, test4=4)
{"test2": 2, "test3": 3, "test4": 4},
)
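The combined-types test above exercises flattening of mixed where-clause shapes (raw SQL strings, `(sql, params)` tuples, and field dicts) into one fragment plus a merged params dict. An illustrative sketch of that behaviour, not PgDB's actual implementation:

```python
def flatten_where(clauses):
    """Flatten mixed where-clause types into (sql_fragment, params)."""
    sql_parts, params = [], {}
    for clause in clauses:
        if isinstance(clause, str):
            # Raw SQL, used verbatim.
            sql_parts.append(clause)
        elif isinstance(clause, tuple):
            # (sql, params) pair: keep the SQL, merge its params.
            sql_parts.append(clause[0])
            params.update(clause[1])
        elif isinstance(clause, dict):
            # field -> value: generate a named-placeholder comparison.
            for field, value in clause.items():
                sql_parts.append(f'"{field}" = %({field})s')
                params[field] = value
    return " AND ".join(sql_parts), params


sql, params = flatten_where(
    ["test1 = 1", ("test2 LIKE %(test2)s", {"test2": 2}), {"test3": 3}]
)
assert sql == 'test1 = 1 AND test2 LIKE %(test2)s AND "test3" = %(test3)s'
assert params == {"test2": 2, "test3": 3}
```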
def test_format_where_clauses_with_where_op():
where_clauses = dict(test1=1, test2=2)
assert PgDB._format_where_clauses(where_clauses, where_op='OR') == (
where_clauses = {"test1": 1, "test2": 2}
assert PgDB._format_where_clauses(where_clauses, where_op="OR") == (
'"test1" = %(test1)s OR "test2" = %(test2)s',
where_clauses
where_clauses,
)
def test_add_where_clauses():
sql = "SELECT * FROM table"
where_clauses = dict(test1=1, test2=2)
where_clauses = {"test1": 1, "test2": 2}
assert PgDB._add_where_clauses(sql, None, where_clauses) == (
sql + ' WHERE "test1" = %(test1)s AND "test2" = %(test2)s',
where_clauses
where_clauses,
)
def test_add_where_clauses_preserved_params():
sql = "SELECT * FROM table"
where_clauses = dict(test1=1, test2=2)
params = dict(fake1=1)
where_clauses = {"test1": 1, "test2": 2}
params = {"fake1": 1}
assert PgDB._add_where_clauses(sql, params.copy(), where_clauses) == (
sql + ' WHERE "test1" = %(test1)s AND "test2" = %(test2)s',
dict(**where_clauses, **params)
{**where_clauses, **params},
)
def test_add_where_clauses_with_op():
sql = "SELECT * FROM table"
where_clauses = ('test1=1', 'test2=2')
assert PgDB._add_where_clauses(sql, None, where_clauses, where_op='OR') == (
sql + ' WHERE test1=1 OR test2=2',
dict()
where_clauses = ("test1=1", "test2=2")
assert PgDB._add_where_clauses(sql, None, where_clauses, where_op="OR") == (
sql + " WHERE test1=1 OR test2=2",
{},
)
def test_add_where_clauses_with_duplicated_field():
sql = "UPDATE table SET test1=%(test1)s"
params = dict(test1='new_value')
where_clauses = dict(test1='where_value')
params = {"test1": "new_value"}
where_clauses = {"test1": "where_value"}
assert PgDB._add_where_clauses(sql, params, where_clauses) == (
sql + ' WHERE "test1" = %(test1_1)s',
dict(test1='new_value', test1_1='where_value')
{"test1": "new_value", "test1_1": "where_value"},
)
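The duplicated-field test above checks collision avoidance: when a WHERE field name already exists in the params dict (here, the UPDATE's SET value for `test1`), the where-clause parameter gets a numeric suffix (`test1_1`) so both values survive. A hypothetical helper demonstrating the renaming, not PgDB's code:

```python
def add_where_param(params, field, value):
    """Bind a where-clause value, suffixing the name on collision."""
    name = field
    idx = 0
    while name in params:
        # "test1" is taken -> try "test1_1", "test1_2", ...
        idx += 1
        name = f"{field}_{idx}"
    params[name] = value
    return f'"{field}" = %({name})s', params


clause, params = add_where_param({"test1": "new_value"}, "test1", "where_value")
assert clause == '"test1" = %(test1_1)s'
assert params == {"test1": "new_value", "test1_1": "where_value"}
```

Without the rename, the second binding would silently overwrite the SET value and the UPDATE would write the wrong data.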
@@ -257,107 +294,105 @@ def test_quote_table_name():
def test_insert(mocker, test_pgdb):
values = dict(test1=1, test2=2)
values = {"test1": 1, "test2": 2}
mocker.patch(
'mylib.pgsql.PgDB.doSQL',
"mylib.pgsql.PgDB.doSQL",
generate_mock_doSQL(
'INSERT INTO "mytable" ("test1", "test2") VALUES (%(test1)s, %(test2)s)',
values
)
'INSERT INTO "mytable" ("test1", "test2") VALUES (%(test1)s, %(test2)s)', values
),
)
assert test_pgdb.insert('mytable', values)
assert test_pgdb.insert("mytable", values)
def test_insert_just_try(mocker, test_pgdb):
mocker.patch('mylib.pgsql.PgDB.doSQL', mock_doSQL_just_try)
assert test_pgdb.insert('mytable', dict(test1=1, test2=2), just_try=True)
mocker.patch("mylib.pgsql.PgDB.doSQL", mock_doSQL_just_try)
assert test_pgdb.insert("mytable", {"test1": 1, "test2": 2}, just_try=True)
def test_update(mocker, test_pgdb):
values = dict(test1=1, test2=2)
where_clauses = dict(test3=3, test4=4)
values = {"test1": 1, "test2": 2}
where_clauses = {"test3": 3, "test4": 4}
mocker.patch(
'mylib.pgsql.PgDB.doSQL',
"mylib.pgsql.PgDB.doSQL",
generate_mock_doSQL(
'UPDATE "mytable" SET "test1" = %(test1)s, "test2" = %(test2)s WHERE "test3" = %(test3)s AND "test4" = %(test4)s',
dict(**values, **where_clauses)
)
'UPDATE "mytable" SET "test1" = %(test1)s, "test2" = %(test2)s WHERE "test3" ='
' %(test3)s AND "test4" = %(test4)s',
{**values, **where_clauses},
),
)
assert test_pgdb.update('mytable', values, where_clauses)
assert test_pgdb.update("mytable", values, where_clauses)
def test_update_just_try(mocker, test_pgdb):
mocker.patch('mylib.pgsql.PgDB.doSQL', mock_doSQL_just_try)
assert test_pgdb.update('mytable', dict(test1=1, test2=2), None, just_try=True)
mocker.patch("mylib.pgsql.PgDB.doSQL", mock_doSQL_just_try)
assert test_pgdb.update("mytable", {"test1": 1, "test2": 2}, None, just_try=True)
def test_delete(mocker, test_pgdb):
where_clauses = dict(test1=1, test2=2)
where_clauses = {"test1": 1, "test2": 2}
mocker.patch(
'mylib.pgsql.PgDB.doSQL',
"mylib.pgsql.PgDB.doSQL",
generate_mock_doSQL(
'DELETE FROM "mytable" WHERE "test1" = %(test1)s AND "test2" = %(test2)s',
where_clauses
)
'DELETE FROM "mytable" WHERE "test1" = %(test1)s AND "test2" = %(test2)s', where_clauses
),
)
assert test_pgdb.delete('mytable', where_clauses)
assert test_pgdb.delete("mytable", where_clauses)
def test_delete_just_try(mocker, test_pgdb):
mocker.patch('mylib.pgsql.PgDB.doSQL', mock_doSQL_just_try)
assert test_pgdb.delete('mytable', None, just_try=True)
mocker.patch("mylib.pgsql.PgDB.doSQL", mock_doSQL_just_try)
assert test_pgdb.delete("mytable", None, just_try=True)
def test_truncate(mocker, test_pgdb):
mocker.patch(
'mylib.pgsql.PgDB.doSQL',
generate_mock_doSQL('TRUNCATE "mytable"', None)
)
mocker.patch("mylib.pgsql.PgDB.doSQL", generate_mock_doSQL('TRUNCATE TABLE "mytable"', None))
assert test_pgdb.truncate('mytable')
assert test_pgdb.truncate("mytable")
def test_truncate_just_try(mocker, test_pgdb):
mocker.patch('mylib.pgsql.PgDB.doSQL', mock_doSelect_just_try)
assert test_pgdb.truncate('mytable', just_try=True)
mocker.patch("mylib.pgsql.PgDB.doSQL", mock_doSelect_just_try)
assert test_pgdb.truncate("mytable", just_try=True)
def test_select(mocker, test_pgdb):
fields = ('field1', 'field2')
where_clauses = dict(test3=3, test4=4)
fields = ("field1", "field2")
where_clauses = {"test3": 3, "test4": 4}
expected_return = [
dict(field1=1, field2=2),
dict(field1=2, field2=3),
{"field1": 1, "field2": 2},
{"field1": 2, "field2": 3},
]
order_by = "field1, DESC"
limit = 10
mocker.patch(
'mylib.pgsql.PgDB.doSelect',
"mylib.pgsql.PgDB.doSelect",
generate_mock_doSQL(
'SELECT "field1", "field2" FROM "mytable" WHERE "test3" = %(test3)s AND "test4" = %(test4)s ORDER BY ' + order_by,
where_clauses, expected_return
)
'SELECT "field1", "field2" FROM "mytable" WHERE "test3" = %(test3)s AND "test4" ='
" %(test4)s ORDER BY " + order_by + " LIMIT " + str(limit), # nosec: B608
where_clauses,
expected_return,
),
)
assert test_pgdb.select('mytable', where_clauses, fields, order_by=order_by) == expected_return
assert (
test_pgdb.select("mytable", where_clauses, fields, order_by=order_by, limit=limit)
== expected_return
)
def test_select_without_field_and_order_by(mocker, test_pgdb):
mocker.patch(
'mylib.pgsql.PgDB.doSelect',
generate_mock_doSQL(
'SELECT * FROM "mytable"'
)
)
mocker.patch("mylib.pgsql.PgDB.doSelect", generate_mock_doSQL('SELECT * FROM "mytable"'))
assert test_pgdb.select('mytable')
assert test_pgdb.select("mytable")
def test_select_just_try(mocker, test_pgdb):
mocker.patch('mylib.pgsql.PgDB.doSQL', mock_doSelect_just_try)
assert test_pgdb.select('mytable', None, None, just_try=True)
mocker.patch("mylib.pgsql.PgDB.doSQL", mock_doSelect_just_try)
assert test_pgdb.select("mytable", None, None, just_try=True)
#
# Tests on main methods
@@ -365,19 +400,14 @@ def test_select_just_try(mocker, test_pgdb):
def test_connect(mocker, test_pgdb):
expected_kwargs = dict(
dbname=test_pgdb._db,
user=test_pgdb._user,
host=test_pgdb._host,
password=test_pgdb._pwd
)
expected_kwargs = {
"dbname": test_pgdb._db,
"user": test_pgdb._user,
"host": test_pgdb._host,
"password": test_pgdb._pwd,
}
mocker.patch(
'psycopg2.connect',
generate_mock_args(
expected_kwargs=expected_kwargs
)
)
mocker.patch("psycopg2.connect", generate_mock_args(expected_kwargs=expected_kwargs))
assert test_pgdb.connect()
@@ -391,61 +421,78 @@ def test_close_connected(fake_connected_pgdb):
def test_setEncoding(fake_connected_pgdb):
assert fake_connected_pgdb.setEncoding('utf8')
assert fake_connected_pgdb.setEncoding("utf8")
def test_setEncoding_not_connected(fake_pgdb):
assert fake_pgdb.setEncoding('utf8') is False
assert fake_pgdb.setEncoding("utf8") is False
def test_setEncoding_on_exception(fake_connected_pgdb):
fake_connected_pgdb._conn.expected_exception = True
assert fake_connected_pgdb.setEncoding('utf8') is False
assert fake_connected_pgdb.setEncoding("utf8") is False
def test_doSQL(fake_connected_pgdb):
fake_connected_pgdb._conn.expected_sql = 'DELETE FROM table WHERE test1 = %(test1)s'
fake_connected_pgdb._conn.expected_params = dict(test1=1)
fake_connected_pgdb.doSQL(fake_connected_pgdb._conn.expected_sql, fake_connected_pgdb._conn.expected_params)
fake_connected_pgdb._conn.expected_sql = "DELETE FROM table WHERE test1 = %(test1)s"
fake_connected_pgdb._conn.expected_params = {"test1": 1}
fake_connected_pgdb.doSQL(
fake_connected_pgdb._conn.expected_sql, fake_connected_pgdb._conn.expected_params
)
def test_doSQL_without_params(fake_connected_pgdb):
fake_connected_pgdb._conn.expected_sql = 'DELETE FROM table'
fake_connected_pgdb._conn.expected_sql = "DELETE FROM table"
fake_connected_pgdb.doSQL(fake_connected_pgdb._conn.expected_sql)
def test_doSQL_just_try(fake_connected_just_try_pgdb):
assert fake_connected_just_try_pgdb.doSQL('DELETE FROM table')
assert fake_connected_just_try_pgdb.doSQL("DELETE FROM table")
def test_doSQL_on_exception(fake_connected_pgdb):
fake_connected_pgdb._conn.expected_exception = True
assert fake_connected_pgdb.doSQL('DELETE FROM table') is False
assert fake_connected_pgdb.doSQL("DELETE FROM table") is False
def test_doSelect(fake_connected_pgdb):
fake_connected_pgdb._conn.expected_sql = 'SELECT * FROM table WHERE test1 = %(test1)s'
fake_connected_pgdb._conn.expected_params = dict(test1=1)
fake_connected_pgdb._conn.expected_return = [dict(test1=1)]
assert fake_connected_pgdb.doSelect(fake_connected_pgdb._conn.expected_sql, fake_connected_pgdb._conn.expected_params) == fake_connected_pgdb._conn.expected_return
fake_connected_pgdb._conn.expected_sql = "SELECT * FROM table WHERE test1 = %(test1)s"
fake_connected_pgdb._conn.expected_params = {"test1": 1}
fake_connected_pgdb._conn.expected_cursor_factory = RealDictCursor
fake_connected_pgdb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_pgdb.doSelect(
fake_connected_pgdb._conn.expected_sql, fake_connected_pgdb._conn.expected_params
)
== fake_connected_pgdb._conn.expected_return
)
def test_doSelect_without_params(fake_connected_pgdb):
fake_connected_pgdb._conn.expected_sql = 'SELECT * FROM table'
fake_connected_pgdb._conn.expected_return = [dict(test1=1)]
assert fake_connected_pgdb.doSelect(fake_connected_pgdb._conn.expected_sql) == fake_connected_pgdb._conn.expected_return
fake_connected_pgdb._conn.expected_sql = "SELECT * FROM table"
fake_connected_pgdb._conn.expected_cursor_factory = RealDictCursor
fake_connected_pgdb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_pgdb.doSelect(fake_connected_pgdb._conn.expected_sql)
== fake_connected_pgdb._conn.expected_return
)
def test_doSelect_on_exception(fake_connected_pgdb):
fake_connected_pgdb._conn.expected_cursor_factory = RealDictCursor
fake_connected_pgdb._conn.expected_exception = True
assert fake_connected_pgdb.doSelect('SELECT * FROM table') is False
assert fake_connected_pgdb.doSelect("SELECT * FROM table") is False
def test_doSelect_just_try(fake_connected_just_try_pgdb):
fake_connected_just_try_pgdb._conn.expected_sql = 'SELECT * FROM table WHERE test1 = %(test1)s'
fake_connected_just_try_pgdb._conn.expected_params = dict(test1=1)
fake_connected_just_try_pgdb._conn.expected_return = [dict(test1=1)]
assert fake_connected_just_try_pgdb.doSelect(
fake_connected_just_try_pgdb._conn.expected_sql,
fake_connected_just_try_pgdb._conn.expected_params
) == fake_connected_just_try_pgdb._conn.expected_return
fake_connected_just_try_pgdb._conn.expected_sql = "SELECT * FROM table WHERE test1 = %(test1)s"
fake_connected_just_try_pgdb._conn.expected_params = {"test1": 1}
fake_connected_just_try_pgdb._conn.expected_cursor_factory = RealDictCursor
fake_connected_just_try_pgdb._conn.expected_return = [{"test1": 1}]
assert (
fake_connected_just_try_pgdb.doSelect(
fake_connected_just_try_pgdb._conn.expected_sql,
fake_connected_just_try_pgdb._conn.expected_params,
)
== fake_connected_just_try_pgdb._conn.expected_return
)


@@ -3,13 +3,14 @@
import datetime
import os
import pytest
from mylib.telltale import TelltaleFile
def test_create_telltale_file(tmp_path):
filename = 'test'
filename = "test"
file = TelltaleFile(filename=filename, dirpath=tmp_path)
assert file.filename == filename
assert file.dirpath == tmp_path
@@ -24,15 +25,15 @@ def test_create_telltale_file(tmp_path):
def test_create_telltale_file_with_filepath_and_invalid_dirpath():
with pytest.raises(AssertionError):
TelltaleFile(filepath='/tmp/test', dirpath='/var/tmp')
TelltaleFile(filepath="/tmp/test", dirpath="/var/tmp") # nosec: B108
def test_create_telltale_file_with_filepath_and_invalid_filename():
with pytest.raises(AssertionError):
TelltaleFile(filepath='/tmp/test', filename='other')
TelltaleFile(filepath="/tmp/test", filename="other") # nosec: B108
def test_remove_telltale_file(tmp_path):
file = TelltaleFile(filename='test', dirpath=tmp_path)
file = TelltaleFile(filename="test", dirpath=tmp_path)
file.update()
assert file.remove()