
Interesting reading; I share some of the points in the post. However, yet another dependency manager?

Mostly I've used plain `python -m venv venv` and it has always worked well. One downside: you need to add a few bash scripts to automate the typical workflow for your teammates.
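For what it's worth, those "few bash scripts" can boil down to a single idempotent setup script. A minimal sketch (file and directory names here are illustrative, not from the post):

```shell
#!/usr/bin/env bash
# setup.sh -- hypothetical helper: create the venv once, then install pinned deps.
set -euo pipefail

VENV_DIR="venv"

# Create the virtualenv only if it does not exist yet (safe to re-run).
if [ ! -d "$VENV_DIR" ]; then
    python3 -m venv "$VENV_DIR"
fi

# Install pinned dependencies, if a requirements file is present.
if [ -f requirements.txt ]; then
    "$VENV_DIR/bin/pip" install -r requirements.txt
fi
```

Teammates then run one command instead of memorizing the venv/pip incantations.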

Pipenv sounds great, but there are some pitfalls as well. I went through this post recently and came away a bit soured on Pipenv: https://chriswarrick.com/blog/2018/07/17/pipenv-promises-a-l...

Another point is that it does not work well with PyCharm and does not let you put all dependencies into the project folder the way I used to with venv. (I just like to keep everything in one folder so it's easy to clean up.)

Are there any better practices to make life easier?



Actually, I recommend bash scripts for automating team workflows as a best practice.

You create a wrapper script around your application that calls a dev-environment setup script, which [if it hasn't been done yet] sets up the environment from scratch for that project or application and loads it before running your application. This does a couple of things.

First, it removes the need to train anyone on using your best practices. The process is already enshrined in a version-controlled executable that anyone can run. You don't even need to 'install lore' or 'install pipenv' - you just run your app. If you need to add documentation, you add comments to the script.

Second, there's no need for anyone to set up an environment - the script does it for you. Either set up your scripts to go through all the hoops to set up a local environment with all dependencies, or track all your development in a Docker image or Dockerfile. The environment's state is tracked by committing both the process scripts and a file with pinned versions of dependencies (as well as the unpinned versions of the requirements so you can occasionally get just the latest dependencies).

Third, the pre-rolled dev environment and executable make your CI-CD processes seamless. You don't need to "set up" a CI-CD environment to run your app. Just check out the code and run the application script. This also ensures your dev environment setup scripts are always working, because if they aren't, your CI-CD builds fail. Since you version-controlled the process, your builds are now more reproducible.

All this can be language-agnostic and platform-agnostic. You can use a tool like Pipenv to save some steps, but you do not need to. A bash script that calls virtualenv and pip, plus a file with frozen requirements, does 99% of what most people need. You can also use pyenv to track and use the same Python version.
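The wrapper-script idea above can be sketched in a few lines. Everything here is hypothetical naming: `requirements.lock.txt` stands in for your pinned file and `app.py` for your entry point.

```shell
#!/usr/bin/env bash
# run.sh -- hypothetical wrapper: bootstrap the env on first run, then run the app.
set -euo pipefail

VENV="venv"

# First run only: set up the environment from scratch.
if [ ! -d "$VENV" ]; then
    python3 -m venv "$VENV"
    if [ -f requirements.lock.txt ]; then
        "$VENV/bin/pip" install -r requirements.lock.txt  # pinned versions
    fi
fi

# Hand off to the application inside the environment.
if [ -f app.py ]; then
    exec "$VENV/bin/python" app.py "$@"
fi
```

CI just runs `./run.sh` too, so the setup path gets exercised on every build.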


Completely agree on every bullet point.

Every time I saw simple bash scripts and/or a Makefile used, it did not seem like the idiomatic way of doing things in Python, but after using them for a while it turned out to be one of the best development experiences.


> Another point is that it does not work well with PyCharm and does not allow to put all dependencies into the project folder as I used to do with venv.

This is annoying for AWS Lambdas too, because you have to bundle the dependencies and zip them. It's pretty trivial to go Pipfile -> requirements.txt -> pip install -t if you use a Makefile, but it's definitely an omission. I asked about it on their GitHub, though, and it is a known issue; hopefully it'll be there soon.
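That Pipfile -> requirements.txt -> pip install -t flow, sketched as a shell script rather than a Makefile (names are illustrative; `pipenv lock -r` emitted requirements format in pipenv of that era, while newer versions use `pipenv requirements`):

```shell
#!/usr/bin/env bash
# bundle.sh -- hypothetical Lambda packaging sketch; all paths are illustrative.
set -euo pipefail

BUILD="build"

# Export pinned dependencies from the Pipfile, if pipenv and a Pipfile exist.
if command -v pipenv >/dev/null && [ -f Pipfile ]; then
    pipenv lock -r > requirements.txt
fi

# Install everything into a local folder, the layout Lambda expects in the zip.
mkdir -p "$BUILD"
if [ -f requirements.txt ]; then
    python3 -m pip install -r requirements.txt -t "$BUILD"
fi

# Add the handler code and zip the whole folder for upload.
cp ./*.py "$BUILD"/ 2>/dev/null || true
if command -v zip >/dev/null; then
    (cd "$BUILD" && zip -qr ../lambda.zip .)
fi
```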


JetBrains have heard the prayers :D Here is the announcement of pipenv support: https://blog.jetbrains.com/pycharm/2018/06/pycharm-2018-2-ea...

> because you have to bundle the dependencies and zip it

Btw, I've used Serverless to deploy lambdas in Python and it worked really well. Highly recommended.


Oo nice, I didn't know Serverless worked with Python! Thanks for the heads up :)



