Diffstat (limited to 'programming/python')
-rw-r--r--  programming/python/creating_nice_python_cli_tools.md    |  40
-rw-r--r--  programming/python/dependency_handling.md               | 116
-rw-r--r--  programming/python/project_setup.md                     | 117
-rw-r--r--  programming/python/python_modules_primer.md             | 269
-rw-r--r--  programming/python/scraping_with_selenium_on_docker.md  |   7
5 files changed, 549 insertions, 0 deletions
diff --git a/programming/python/creating_nice_python_cli_tools.md b/programming/python/creating_nice_python_cli_tools.md
new file mode 100644
index 00000000..53b3f51b
--- /dev/null
+++ b/programming/python/creating_nice_python_cli_tools.md
@@ -0,0 +1,40 @@

Following this advice can make your tools easy for others to install, pleasant to use, robust, cross-platform, and powerful.

* Use [my suggestions for setting up Python projects](project_setup.md), particularly:
  * Provide instructions for installing your tool using [pipx](https://github.com/pypa/pipx).
    With pipx, people can install and upgrade your script with a simple command that requires no administrative privileges (it only requires having Python and pipx installed).
  * As you will be using [poetry](https://python-poetry.org/), following the recommendations above:
    * Use [Poetry's support for specifying scripts](https://python-poetry.org/docs/pyproject/#scripts), so that when your tool is installed via pipx or other means, its scripts are added to the user's path.
    * The dependencies you define will be installed automatically along with your application.
      This reduces the effort users need to make to use your application if you rely on third-party libraries.
      However, I would still advise avoiding unnecessary dependencies (for simple HTTP requests you can use the standard library; if you make complex requests, a third-party library might be much simpler).
      As you are using pipx, those dependencies will be installed into an isolated virtualenv, so they will not interfere with anything on your system.
    * As your application is properly packaged, you can split your code into different Python files and use imports without issues.
* If your application requires secrets, such as credentials, consider using:
  * The standard [getpass](https://docs.python.org/3/library/getpass.html) module.
    It prompts for a string on the command line, hiding what the user types.
  * The [keyring](https://pypi.org/project/keyring/) library.
    It stores secrets using your operating system's facilities.
* Use the [appdirs](https://pypi.org/project/appdirs/) library to obtain "user paths", such as the user's directories for configuration, cache, or data.
  appdirs knows the proper paths for Linux, macOS, and Windows.
  For example, if your tool caches files and uses appdirs to find the cache directory, you might gain benefits such as cache files being excluded from backups.
* If your tool requires significant time to complete a process:
  * Use the [tqdm](https://tqdm.github.io/) library to add a progress bar.
  * But also consider using the standard [concurrent.futures](https://docs.python.org/3/library/concurrent.futures.html) module to add parallelism if you can.
    The [map](https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.Executor.map) function is particularly easy to use (see the sketch at the end of this note).
    Use it with a [ThreadPoolExecutor](https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.ThreadPoolExecutor) if the parallel tasks are IO-bound or invoke other programs, or with a [ProcessPoolExecutor](https://docs.python.org/3/library/concurrent.futures.html#processpoolexecutor) if they perform significant CPU work in Python (to avoid the [GIL](https://wiki.python.org/moin/GlobalInterpreterLock)).
  * Consider using the standard [logging](https://docs.python.org/3/library/logging.html) module with a format that includes a timestamp, so users can inspect how much time is spent in different parts of the program.
    You can also use the logging module to implement flags such as `--debug` and `--verbose`.
* Although fancier tools exist, the standard [argparse](https://docs.python.org/3/library/argparse.html) module is good enough for most argument parsing.
  It has decent support for [sub-commands](https://docs.python.org/3/library/argparse.html#sub-commands), and the linked document describes a very nice pattern for defining functions for sub-commands, under "One particularly effective way of handling sub-commands...".
  Provide help text for non-obvious parameters.
  argparse supports many different argument types with plenty of functionality out of the box, such as enumerated options, integers, and file names.
  The main reason to use a fancier argument parser is that argparse does not have autocomplete support, but you can add [argcomplete](https://github.com/kislyuk/argcomplete) to an argparse program with minimal modifications to retrofit autocomplete.
* Remember that the standard [json](https://docs.python.org/3/library/json.html) module is built in.
  You can use it to add a mode to your tool that generates JSON output instead of human-readable output, for easy automation of your tool, perhaps using [jq](https://stedolan.github.io/jq/) or [fx](https://github.com/antonmedv/fx).
* Use the standard [subprocess](https://docs.python.org/3/library/subprocess.html) module to execute other commands.
  * Remember never to use `shell=True`; among other things, your tool will then work correctly with file names containing spaces.
  * Use `check=True` so that if the subprocess fails, an exception is raised.
    This is likely the best default behavior; although the resulting error is a bit ugly, it normally prevents nastier problems and it is a safe option.

You can find examples of many of these techniques in my [repos](https://github.com/alexpdp7?tab=repositories&q=&type=&language=python&sort=).
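To make this more concrete, here is a minimal sketch that combines several of these suggestions: argparse, subprocess with `check=True` and without `shell=True`, `concurrent.futures` with a thread pool, and tqdm.
The tool itself is made up for the example (it pings a list of hosts using the Unix-style `ping -c 1`, so it is not cross-platform as written):

```
#!/usr/bin/env python3
"""Ping several hosts in parallel, showing a progress bar."""
import argparse
import concurrent.futures
import subprocess

import tqdm  # third-party; declare it as a dependency of your project


def ping(host):
    # No shell=True, so host names are passed safely as plain arguments;
    # check=True raises CalledProcessError if ping exits with a non-zero status.
    subprocess.run(["ping", "-c", "1", host], check=True, capture_output=True)
    return host


def main():
    parser = argparse.ArgumentParser(description="Ping several hosts in parallel.")
    parser.add_argument("hosts", nargs="+", help="host names or IP addresses to ping")
    parser.add_argument("--workers", type=int, default=8, help="number of parallel pings")
    args = parser.parse_args()

    # ThreadPoolExecutor, because the tasks are IO-bound (they wait on another program);
    # tqdm wraps the iterator returned by map to display progress.
    with concurrent.futures.ThreadPoolExecutor(max_workers=args.workers) as executor:
        for host in tqdm.tqdm(executor.map(ping, args.hosts), total=len(args.hosts)):
            pass


if __name__ == "__main__":
    main()
```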
diff --git a/programming/python/dependency_handling.md b/programming/python/dependency_handling.md
new file mode 100644
index 00000000..3f1db103
--- /dev/null
+++ b/programming/python/dependency_handling.md
@@ -0,0 +1,116 @@

# Some brief notes about Python dependency management

This article is mostly written for people who have already used Setuptools and have faced issues derived from its "limitations".
Specifically, if you have seen files named `requirements.txt` and have wondered how they work, what problem they solve, and whether they are something you should investigate, I hope you find this article interesting.

If you are starting to write Python software and you are looking for an introductory text about distributing your software and using dependencies, I would recommend skipping directly to the "new generation" Python packaging tools.
This way, you can avoid most of the complexities in this post.
You can also check out the [Python Packaging User Guide](https://packaging.python.org/en/latest/) and [my own prescriptive project setup recommendations](project_setup.md).

Most programs can use third-party libraries to implement parts of their functionality without implementing everything from scratch.

pip is the recommended package installer for Python.
Python installers include pip, although pip is a component that can be installed separately from Python.
Some Linux distributions separate pip from the main Python package (for example, Debian has a `python3` package and a `python3-pip` package), but a Python install without pip is not really fully functional for many purposes.

pip fetches Python packages from diverse sources and adds them to a Python installation.
Python packages can specify other packages as dependencies, so when pip installs a package, it also installs the required dependency chain.
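For example (a sketch; `requests` is just a common library that happens to have dependencies of its own):

```
$ pip install requests   # also installs requests' own dependencies, such as urllib3 and certifi
```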
The traditional mechanism for packages to specify dependencies is Setuptools and other closely related projects.

## About Setuptools

Setuptools is a build and distribution system based on the distutils module that used to be part of the Python standard library.

Package metadata in Setuptools can be defined in several different ways, such as a `setup.py` file, a `setup.cfg` file, or a `pyproject.toml` file.
In these files, you list the dependencies for your package, specifying the name of each package and constraints.

Constraints define which versions of a dependency you want to use.
A constraint does not have to be an exact version; it can also be a range of versions, or a constraint such as "lower than version n".

(Constraints can additionally specify other restrictions, such as requiring different versions for different Python versions, and other interesting possibilities.)

When using Setuptools, and dependencies that themselves use Setuptools, you can quickly run into problems.

If packages specify exact dependency versions, then there is a high chance of packages having conflicting requirements.

If packages do not specify exact dependency versions, then the actual versions that pip installs can vary as new versions of packages are released.
This can lead to bugs, because code might not work properly when using newer versions of dependencies.

## Version locking and `requirements.txt`

There is a dependency-management approach that can be very effective in many cases.

This approach involves differentiating between "applications" and "libraries".

Libraries are Python packages meant to be used as a dependency by other Python code.
Applications are Python code that may use other libraries as dependencies, but which no other Python code depends on.

### Specifying dependencies for libraries

Libraries specify coarse but safe dependency requirements.

Suppose that we are developing the foo library.
The foo library depends on the bar library.
The bar library uses a versioning scheme similar to semantic versioning.
When we develop the foo library, we use version 1.2.3 of the bar library.

Then, we specify that the foo library depends on the bar library, with a version constraint like `>=1.2.3, <1.3`.
This version constraint lets the library be used with version 1.2.4, which is likely compatible with the code in the foo library, and may even introduce valuable bug fixes.
However, version 1.3.0 of the bar library would not be a valid dependency.
This is probably a good idea; version 1.3.0 may contain changes that the foo code is incompatible with.
(When we later create new versions of the foo library, we may want to consider depending on newer versions of the bar library, and possibly update the code so it continues working correctly.)

This helps reduce conflicts.
As libraries specify coarse dependencies, the chances of two libraries having incompatible requirements are lower.
However, specifying coarse dependencies probably requires more testing, to ensure that the library works correctly when different dependency versions are installed.
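As a sketch, the foo library's `setup.cfg` could declare this constraint along these lines (the names and versions are the hypothetical ones from this section):

```
[metadata]
name = foo
version = 1.0.0

[options]
install_requires =
    bar >=1.2.3, <1.3
```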
### Specifying dependencies for applications

Applications specify exact dependency requirements.

While libraries are not usually run on their own, applications are executed directly by end users.
If a library does not work well, you can temporarily go back to an older version or apply other fixes.
But if an application does not work correctly, you have worse problems.

If you specify exact dependency versions for an application, users of the application will always use a single combination of dependencies, which makes it much easier to keep things robust.

A popular approach is for applications to specify Setuptools requirements with coarse versioning (just like libraries do), but to also provide a list of the specific versions used for development and deployment.
To create this list of dependencies, you can install your application using pip or some other mechanism, then extract a list of the dependency versions that were installed and store it in a file.
For example, you can do this by executing:

```
$ pip install .  # executed from the root of the application source code
$ pip freeze >requirements.txt
```

Later on, if you install the application using the following command:

```
$ pip install -r requirements.txt
```

then you will always install the same set of dependencies, preventing issues caused by updated dependencies.

Note: pip and other package installers do *not* automatically use `requirements.txt` or any similar file; they only use `setup.cfg` and the other metadata files defined by Setuptools.
If you do not install your application explicitly using `pip install -r requirements.txt`, you will probably install a different set of dependencies.

## Beyond version locking

Following the approach above can be enough to use dependencies correctly.

However, maintaining the Setuptools version constraints and `requirements.txt` by hand is straightforward, but tedious.
Also, this approach to dependency management is not obvious, and may not be easy to get completely right.

For these reasons, several projects have appeared that implement approaches similar to the one described above, but in a more automatic and prescriptive way.
These projects often manage a file equivalent to `requirements.txt` automatically, while the developer only specifies coarse dependencies for applications.

Some of these tools are listed on [a page about relevant packaging projects](https://packaging.python.org/en/latest/key_projects/) maintained by the [Python Packaging Authority](https://www.pypa.io/).
Look for the tools that manage dependencies and packaging.

Thanks to some improvements in the Python ecosystem, pip can nowadays correctly install packages that were built with many different packaging tools.

These projects can also offer other improvements, so I would encourage Python developers to investigate them and try them out.

However, also note that, if you follow a correct approach, Setuptools and manual version locking are a perfectly valid way to manage Python code dependencies.
Also, there are projects such as [pip-tools](https://github.com/jazzband/pip-tools) that complement Setuptools, addressing many of the issues described here, without requiring entirely new packaging tools.
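For illustration, a typical pip-tools workflow looks roughly like this (a sketch; it assumes your coarse dependencies are declared in `pyproject.toml`, though `setup.cfg` or a `requirements.in` file also work):

```
$ pip install pip-tools
$ pip-compile --output-file requirements.txt pyproject.toml
$ pip-sync requirements.txt   # makes the environment match requirements.txt exactly
```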
diff --git a/programming/python/project_setup.md b/programming/python/project_setup.md
new file mode 100644
index 00000000..e945be71
--- /dev/null
+++ b/programming/python/project_setup.md
@@ -0,0 +1,117 @@

There is a significant amount of Python project tooling.
This document collects my personal recommendations on how to set up a Python project.

It is not meant to reflect the best or most common practices, just my personal taste.

# Use pipx

Pipx is a tool that installs Python packages into your user environment.
It creates an isolated environment for every tool, so if you install multiple packages they will not have version conflicts.
It also takes care of adding a package's entry points to your user path.

Pipx is useful for two purposes:

* To install tools such as poetry
* To let other users install your software easily

# Use Poetry

When using third-party dependencies in your Python code, it is highly desirable to avoid installing any project-specific dependency outside the project.

To achieve that, virtualenvs are traditionally used; these are miniature Python installations where you can install any library you want.
Virtualenvs need to be explicitly activated to be used, so it is easy to have a virtualenv for each Python project you are working on.

Poetry is a tool that leverages virtualenvs to manage a project's dependencies, and it manages the virtualenvs themselves automatically.

There are many similar tools, such as pipenv, and there are multiple ways to specify a project's dependencies (`setup.py`, `requirements.txt`, etc.); Poetry provides a convenient way to do everything.

You can install poetry using pipx.

Commit `poetry.lock` to version control.
For runtime dependencies, specify bounded dependency ranges.
For development dependencies, use unbounded dependencies (see the sketch below).
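As a sketch, the dependency sections of the resulting `pyproject.toml` might look like this (the package names and versions are made up for the example; the `[tool.poetry.group.dev.dependencies]` table requires Poetry 1.2 or later, older versions use `[tool.poetry.dev-dependencies]`):

```
[tool.poetry.dependencies]
python = "^3.8"
requests = ">=2.28, <3"   # runtime dependency: bounded range

[tool.poetry.group.dev.dependencies]
pytest = "*"              # development dependencies: unbounded
ipython = "*"
ipdb = "*"
```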
# Test your code

Write enough tests that you can make changes to your code with confidence.

If you find yourself iterating slowly over a piece of code, try to isolate the code you are writing so it can be tested in isolation for faster iteration.

## Use pytest for testing

Python provides *two* testing frameworks in its standard library, but they have some limitations:

* `unittest` is an xUnit-style testing framework which follows non-PEP-8 naming conventions (probably because it copied Java's JUnit), so extra work needs to be done to make your test cases PEP-8 compliant.
* `doctest` is a tool which allows you to run tests embedded in docstrings. For some code, it is great and helps you provide good, up-to-date documentation. However, a significant amount of code is awkward to test using `doctest`.

Use `doctest` whenever you can, but outside that, use `pytest` to write PEP-8-compliant tests.

Ensure that your test suite runs correctly by running `pytest` without any arguments.

Use plain Python `assert` statements to check assertions in your tests; `pytest` does some magic to provide nice error messages on failed assertions.

## Gate your changes with testing

Set up your version control so changes cannot be made to your main codeline without passing continuous integration tests (and possibly, code review).

# Perform automated code formatting and static checking

## Use Black

Use Black to format your code.

## Use flake8

Use `flake8` to gate changes. Use `flake8-black` to prevent committing code that does not follow Black style.

## Evaluate the use of mypy

If you think it will benefit your codebase, consider integrating mypy as soon as possible.

# Version control

## Use a minimal gitignore file

Keep editor-specific ignores in a personal `excludesfile`. Do not include patterns in gitignore that do not match anything generated by documented and supported development procedures.

## Keep your code together

All the code you modify as part of the project should be kept in a single repository so you can make atomic changes.
If you find yourself making changes across multiple repositories and having to coordinate them, consider merging those repositories.

Use git submodules or similar mechanisms to refer to code you modify that must be kept external.

Use git subrepo to publish parts of the repository outside the main repository if needed.

# Support multiple modern versions of Python

Unless you have a specific requirement to support Python 2, don't.

It is reasonable to support multiple versions of Python 3 from 3.4 onwards. Supporting the oldest versions might limit the features you can use (although some features from more modern versions have been backported), so evaluate which operating systems and versions you need to support and try to support the Python versions readily available for them (on Linux, for instance, the versions in mainline distro repos).

Even if you are not running your code using the latest versions of Python, try to support all the newest available versions.

Use continuous integration to run your tests on all supported versions of Python.

This implies that development should be possible without using one specific version of Python, so pyenv or similar tools are not strictly needed.

# Use ipython and ipdb

Add ipython and ipdb as development dependencies.

# Versioning

Unless you have a specific requirement to support multiple versions of your code or to distribute to a platform that *requires* versioning (such as PyPI), do not explicitly version your code but allow implicit versioning (e.g. it should be possible to identify which Git commit deployed code comes from).

# Documentation

Provide a `README` containing:

* The purpose of the code
* How to use the code
* How to develop the code

If the `README` becomes unwieldy, move usage instructions to `USAGE` and/or development instructions to `HACKING`.

Provide docstrings detailing the external interface of Python modules. Provide internal comments in modules detailing the implementation.

Consider using Sphinx to render documentation and publish it to the web if you are developing a library/framework.

# Distribution

If your code can be executed from a command line, consider documenting installation via `pipx`.

If your code has significant binary dependencies, consider publishing a Docker image. Design your Docker images so that rebuilding the image after most changes is fast.

diff --git a/programming/python/python_modules_primer.md b/programming/python/python_modules_primer.md
new file mode 100644
index 00000000..8932c19f
--- /dev/null
+++ b/programming/python/python_modules_primer.md
@@ -0,0 +1,269 @@

# Python Modules Primer

## Prerequisites

These instructions assume a Linux environment.
A macOS environment is similar, but not identical.
A Windows environment differs more.

## Previous knowledge

### A refresher on the `PATH` variable

If you execute the following command in your terminal:

```
$ echo hello
```

the shell searches for the `echo` command in the directories listed in your `PATH` environment variable.
You can display your `PATH` variable by running:

```
$ echo $PATH
/home/user/.local/bin:/home/user/bin:/usr/share/Modules/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin
```

The contents of the `PATH` variable depend on your particular environment.

If you run the following command:

```
$ which echo
/usr/bin/echo
```

the `which` command prints where the shell locates the `echo` command.
### A refresher on shell scripts

If you create a file named `foo.sh` with the following contents:

```
#!/bin/sh

echo hello
```

you have defined a "shell script".
The first line indicates that this shell script is executed using the `/bin/sh` command.
The rest of the file consists of commands to be executed by that shell.
These commands behave as if you had typed them into your terminal, so if you execute this script, the command `echo hello` will be executed, printing `hello`.

If you try to run `foo.sh` like you run the `echo` command, by typing its name, it does not work:

```
$ foo.sh
bash: foo.sh: command not found...
```

because the shell looks for `foo.sh` in the directories listed in the `PATH` variable.
Unless you created the `foo.sh` file in a directory like `/usr/bin`, the shell will not find the `foo.sh` command.

A solution to this problem is to specify the path to the `foo.sh` file, instead of relying on the `PATH` variable.
However, if you do this, you face a second problem.

```
$ ./foo.sh
bash: ./foo.sh: Permission denied
```

This happens because only files with the executable permission can be executed in this way.
To solve this, add the executable permission; then it works:

```
$ chmod +x foo.sh
$ ./foo.sh
hello
```

## The `import` statement in Python

### Importing from the Python standard library

Run the following commands in the Python REPL:

```
$ python3
Python 3.9.17 (main, Aug 9 2023, 00:00:00)
[GCC 11.4.1 20230605 (Red Hat 11.4.1-2)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import datetime
>>> datetime.datetime.now()
datetime.datetime(2023, 9, 11, 21, 53, 16, 331236)
```

`import` works in a similar way to running a command in the shell.
Python searches a number of directories looking for the `datetime` module.

To see which directories are searched, run:

```
$ python3
>>> import sys
>>> sys.path
['', '/usr/lib64/python39.zip', '/usr/lib64/python3.9', '/usr/lib64/python3.9/lib-dynload', '/home/alex/.local/lib/python3.9/site-packages', '/usr/lib64/python3.9/site-packages', '/usr/lib/python3.9/site-packages']
```

`sys.path` is the list of directories that the `import` statement searches.
The contents of `sys.path` depend on your operating system and Python installation method.

On my system, the `/usr/lib64/python3.9` directory contains the `datetime.py` module.

```
$ head /usr/lib64/python3.9/datetime.py
"""Concrete date/time and related types.

See http://www.iana.org/time-zones/repository/tz-link.html for
time zone and DST data sources.
"""

__all__ = ("date", "datetime", "time", "timedelta", "timezone", "tzinfo",
           "MINYEAR", "MAXYEAR")
...
```

`/usr/lib64/python3.9` contains the modules in [the Python standard library](https://docs.python.org/3/library/).

### Importing your Python files

If you create a file named `a.py`:

```
def f():
    return 2
```

and another named `b.py`:

```
import a

print(a.f())
```

then:

```
$ python b.py
2
```

This works because `sys.path` contains `''`, which means "the current directory".

(`sys.path` is very similar to the `PATH` variable. However, `sys.path` contains the current directory by default, whereas `PATH` does not.)

When `import a` is executed, Python searches the directories in `sys.path` for an `a.py` file; it is found when checking the `''` path.
When `import datetime` is executed, Python searches in the current directory (because `''` comes first in the path), does not find it there, and then finds it in the `/usr/lib64/python3.9` directory that comes later in the list.
Python iterates over the `sys.path` directories and loads the *first* matching file.
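You can observe this ordering yourself (a sketch you can try in an empty directory):

```
$ echo 'print("this is ./datetime.py, not the standard library")' > datetime.py
$ python3 -c "import datetime"
this is ./datetime.py, not the standard library
$ rm datetime.py   # clean up; otherwise Python programs run from this directory keep importing it
```

Because `''` comes before the standard library directories in `sys.path`, your local `datetime.py` shadows the standard library module.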
## Installing libraries

When writing Python software, sometimes the modules included in the standard library are enough.
However, frequently you want to use other libraries.
To use third-party Python libraries, you must install them, usually with the `pip` program.

In some Linux distributions, the `pip` program is not part of the `python3` package and comes in a separate `python3-pip` package.

The `pip` program can download libraries from https://pypi.org/ , the Python package index, and install them.
`pip` installs libraries into a "Python environment".

Old versions of `pip` defaulted to installing libraries into the "system" Python environment.
In a Linux system, the system Python environment is located in a directory such as `/usr/lib64/python3.9`.
By default, normal Linux users cannot write to `/usr`, so installing a package would fail.

Modern versions of `pip` detect that they cannot write to the "system" Python environment, and then redirect the install to the "user" Python environment.
The "user" Python environment is in a directory such as `~/.local/lib/python3.9`.

You could use a command such as `sudo pip install` to grant `pip` the privileges required to write to `/usr`.
However, this can make a Linux system unusable.
Most Linux systems include software that uses the "system" Python environment.
Altering the "system" Python environment can break such software.
Do not run `pip install` with root privileges unless you know why you need to.

If you use a modern `pip` (or use the `--user` option), you can install libraries into the "user" Python environment.
However, this is problematic because a Python environment can only contain a single version of a given library.
If you have two different Python programs that require different versions of the same library, then these two programs cannot coexist in the "user" Python environment.

In general, Python virtual environments are used to address this problem.

## Creating Python virtual environments

If you run:

```
$ python3 -m venv <some path>
```

this creates a directory at the path you specify, with the following contents:

```
<some path>
├── bin
│   ├── activate
│   ├── pip
│   └── python
├── include
└── lib
    └── python3.9
```

The `python` and `pip` commands are copies of the same commands from the "system" Python environment.

But these commands work differently from the "system" Python environment commands:

```
$ <some path>/bin/python
>>> import sys
>>> sys.path
['', '/usr/lib64/python39.zip', '/usr/lib64/python3.9', '/usr/lib64/python3.9/lib-dynload', '<some path>/lib64/python3.9/site-packages', '<some path>/lib/python3.9/site-packages']
```

`sys.path` now lists the `lib` directories inside the virtual environment instead of the system and user `site-packages` directories.

When you use the `pip` program from the virtual environment, it installs libraries into the virtual environment.

You can create as many virtual environments as you need, and you can install different versions of libraries into each virtual environment.

## Activating Python environments

You can run the `python` and `pip` commands by specifying the full path, like we did when executing the `foo.sh` command earlier.
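For example (a sketch; `requests` stands in for any third-party library):

```
$ <some path>/bin/pip install requests
$ <some path>/bin/python -c "import requests; print(requests.__version__)"
```

This installs `requests` into the virtual environment and imports it from there, without activating anything.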
By default, if you run `python`, the shell will invoke the `python` command from the "system" Python environment, because it is in a directory included in the `PATH` variable.
If you specify the full path, you override this.

To save typing, the `bin` directory of a virtual environment contains an `activate` file.
The `activate` file is a "special" shell script that must be invoked in one of the following two ways:

```
$ source <some path>/bin/activate
```

```
$ . <some path>/bin/activate
```

`source` and `.` are synonyms.
They are special shell commands that are needed for the `activate` script to work correctly.

`activate` alters your `PATH`, so that the `bin` directory of your virtual environment comes first:

```
$ echo $PATH
/home/user/.local/bin:/home/user/bin:/usr/share/Modules/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin
$ . <some path>/bin/activate
(some path) $ echo $PATH
<some path>/bin:/home/user/.local/bin:/home/user/bin:/usr/share/Modules/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin
```

Thus, if you run `python`, `<some path>/bin/python` will be executed instead of `/usr/bin/python`.

Besides changing your prompt to indicate that the virtual environment is activated, `activate` essentially only alters your `PATH`.
You never need to use `activate` if you always specify the full path to the virtual environment's commands.

## Further reading

* [Some brief notes about Python dependency management](dependency_handling.md) continues this explanation, introducing the need for packaging tools.
* [Installing Python Modules](https://docs.python.org/3/installing/index.html), from the official Python documentation, describes the `pip` program in more depth.
* [`venv` - Creation of virtual environments](https://docs.python.org/3/library/venv.html), from the official Python documentation, describes virtual environments in more depth.

diff --git a/programming/python/scraping_with_selenium_on_docker.md b/programming/python/scraping_with_selenium_on_docker.md
new file mode 100644
index 00000000..61ba1c12
--- /dev/null
+++ b/programming/python/scraping_with_selenium_on_docker.md
@@ -0,0 +1,7 @@

Don't use Selenium, use [Playwright](https://playwright.dev/python/):

* Playwright automatically sets up headless browsers.
* It provides convenient abstractions for locating elements in a page (mostly no XPath required; it can match "intelligently" using text), as the sketch below shows.
* It has a handy UI tool that records your actions in a browser and writes equivalent *readable* Playwright code.

Further reading: https://new.pythonforengineers.com/blog/web-automation-dont-use-selenium-use-playwright/
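For reference, a minimal Playwright script looks roughly like this (a sketch using the synchronous API; it assumes `pip install playwright` followed by `playwright install` to download the browsers, and the "More information" text is specific to example.com):

```
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()  # headless by default, no driver setup needed
    page = browser.new_page()
    page.goto("https://example.com/")
    # Locate the link by its visible text instead of an XPath expression.
    link = page.get_by_text("More information")
    print(link.get_attribute("href"))
    browser.close()
```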
