- The droplet was slogging along.
- Ran `docker system prune --all`, reclaimed 7.5G.
- There's only one core; it's usually railed at almost 100% when deploying, handling a request, etc. It idles at 1-2% otherwise.
- There’s only 1G RAM. There’s about 24G disk space. 0 swap.
- Tried moving the wheel build from the intermediate container to a simple install in the final container. It would still hang.
- `sudo reboot`. Only wanted to restart the docker service, but did a whole system update anyway.
- (unrelated) cleared the release updater with:
- `sudo truncate -s 0 /var/lib/ubuntu-release-upgrader/release-upgrade-available`
- `sudo apt update && sudo apt upgrade`
- `sudo apt full-upgrade`
- `sudo apt autoremove`
- `sudo update-rc.d docker defaults` to make docker start on boot.
- After the system restart, memory usage went down to about 200M (20%). It was at about 80% before, even before building the image.
- Tried rebuilding the supercontest image; it worked this time. Looks like it was just system overload (I'm straining this tiny machine). A restart/upgrade every now and then won't kill us.
- `docker-compose restart supercontest-app-prod` clears a decent amount of mem.
- `docker-compose down` before running the new build also clears a lot of space.
- The flask monitoring dashboard has a db backend, obviously. It’s only 4M right now.
- `/proc/sys/vm/swappiness` controls how aggressively your system will use swap. It's configurable; 60 is the usual default.
- Pruned my laptop’s docker system as well. Reclaimed 21GB.
- Lost both fantasy games, both pretty badly.
- Supercontest.
- Added a persistent (named) docker volume for the flask monitoring dashboard. It was getting wiped on every deploy. Confirmed it now persists.
- Added ipython to the dev image so that the manage.py shell opens it by default.
- Did some backlog cleaning and milestone organization. Issue count is in a good place.
- Centered the headers on the rules markdown. Swapped the picks/points columns on the all-picks view so the colored points cells stand out instead of sitting adjacent to the full main table.
- With Monday, Sunday, and Thursday games combined, there are ~18 hours of active scoring during the week. This is a little over 1000 minutes. If I fetch the nfl scoresheet xml once a minute, that’s 1000 requests a week. Totally handle-able. With our current user base of ~50, and each user opening the app (or different views within it) about 10 times a week, we’re already at half that traffic. Much better to control this and keep it static around 1000, rather than proportional to our user base as we scale.
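- A minimal sketch of that fixed-rate poll, just to make the scaling point concrete (the URL, parser, and scheduling here are placeholders, not the app's real code):

  ```python
  # Hypothetical polling loop: requests scale with the clock, not with the user count.
  import time

  import requests

  SCORESHEET_URL = 'https://example.com/nfl/scorestrip.xml'  # placeholder endpoint

  def handle_scoresheet(xml_text):
      """Placeholder: parse the XML and update scores in the db."""

  def poll_scores(minutes=18 * 60):  # ~18 hours of live scoring per week
      for _ in range(minutes):
          response = requests.get(SCORESHEET_URL, timeout=10)
          response.raise_for_status()
          handle_scoresheet(response.text)
          time.sleep(60)
  ```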
- With a backref or back-populating relationship in sqlalchemy, graphene will break unless you name your SQLAlchemyObjectType classes distinctly from the model names, i.e. Week becomes WeekNode.
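- The workaround looks roughly like this (the model import path is an assumption; the naming pattern is the point):

  ```python
  # Sketch only: name the graphene type WeekNode so it can't collide with the Week model.
  from graphene_sqlalchemy import SQLAlchemyObjectType

  from supercontest.models import Week  # assumed import path for the SQLAlchemy model

  class WeekNode(SQLAlchemyObjectType):
      class Meta:
          model = Week
  ```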
- Relationship changes are just in the ORM. They don't change the raw db at all: no sql, no migration/upgrade required.
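- A throwaway example with made-up minimal models (not the app's real ones) to show why:

  ```python
  # Adding the relationship/backref below changes nothing about the tables themselves,
  # so there's nothing for Alembic to migrate.
  from flask import Flask
  from flask_sqlalchemy import SQLAlchemy

  app = Flask(__name__)
  app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///:memory:'
  db = SQLAlchemy(app)

  class User(db.Model):
      id = db.Column(db.Integer, primary_key=True)
      email = db.Column(db.String)

  class Pick(db.Model):
      id = db.Column(db.Integer, primary_key=True)
      user_id = db.Column(db.Integer, db.ForeignKey('user.id'))  # the FK already existed
      # ORM-only addition: enables pick.user and user.picks with zero schema change
      user = db.relationship('User', backref='picks')
  ```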
- Substantially cleaned the makefile and admin.md.
- Added backrefs for all existing FKs in the models. There’s a lot more information now!
- I don’t think you can filter on a backref with graphene. Still, it’s useful to be able to see all the info (basically a join) in graphiql.
- The automigration for the table creation of league and league_user_association was easy. The only manual changes to the migration file were the creation of the 2018 and 2019 paid leagues, then adding users to them.
- To find all the users from 2018 (`Season` and `User` here are the app's SQLAlchemy models):
- `from supercontest.dbsession.joins import join_to_picks`
- `all_picks = join_to_picks()`
- `results = all_picks.filter(Season.season == 2018).distinct(User.email).all()`
- `emails = [user.email for _, user, _, _, _, _ in results]`
- In the migration file, I create the league table and the league_user association, then I find all the users and add them appropriately. This migration was pretty cool.
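- The hand-written part of that migration is shaped roughly like this (league names/ids and association columns are assumptions for illustration, not the real file):

  ```python
  # Hypothetical sketch of the manual additions after the autogenerated op.create_table calls.
  from alembic import op
  import sqlalchemy as sa

  def upgrade():
      league = sa.table('league', sa.column('id', sa.Integer), sa.column('name', sa.String))
      op.bulk_insert(league, [
          {'id': 1, 'name': 'paid-2018'},
          {'id': 2, 'name': 'paid-2019'},
      ])
      association = sa.table('league_user_association',
                             sa.column('league_id', sa.Integer),
                             sa.column('user_id', sa.Integer))
      # user ids would come from the 2018-users query in the bullets above
      op.bulk_insert(association, [{'league_id': 1, 'user_id': uid} for uid in (1, 2, 3)])
  ```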
- Added full league functionality.
- league0 is the free league. It's not an actual league row; it just means "don't do any filtering by league". It's the same as the app is today, before the league change. This lets users play for free, any year they want.
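- Conceptually it's just this (the helper and the association model are hypothetical, not the app's actual names):

  ```python
  # league 0 means "free play": skip the league filter entirely; any other id
  # narrows the query to that league's members.
  def filter_by_league(query, league_id, association_model):
      """league_id == 0 is the free league: it isn't a real row, so skip filtering."""
      if league_id == 0:
          return query
      return query.join(association_model).filter(association_model.league_id == league_id)
  ```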
- The `url_for(request.url_rule.endpoint, **values)` calls are layered. The season navs and league navs are lumped together; they pass season and league because they're the top level. The week navs pass season, league, and week because they're the lowest. The information beneath (like which tab you're on: matchups or leaderboard) is carried through `request.url_rule.endpoint`. You don't need to provide any lower information; i.e., league navs don't have to pass week or the main tab, and main navs don't have to pass week.
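- Roughly, the layering looks like this (the helper names are stand-ins, not the real template code):

  ```python
  # Each nav passes only the values at its own level and reuses the current endpoint,
  # so whatever view/tab you're already on is preserved automatically.
  from flask import request, url_for

  def league_nav_url(season, league):
      # top level: no week, no tab; url_defaults and the endpoint itself supply the rest
      return url_for(request.url_rule.endpoint, season=season, league=league)

  def week_nav_url(season, league, week):
      # lowest level: passes everything above it plus the week
      return url_for(request.url_rule.endpoint, season=season, league=league, week=week)
  ```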
- The nav_active js obviously matches the element’s id to the route, not the displayed text. This allows you to show whatever you want while doing unique endpoint-nav matching.
- Use `g.<attr>` in the templates; that's what it's for. The redefinition into variables in layout.html is for the javascript only, not for use in the html.
- Rewrote the `url_defaults` and `url_value_preprocessor` to be MUCH clearer about the actions each takes, why, and when.
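- The two hooks end up shaped roughly like this (blueprint and attribute names are assumptions, not the real file):

  ```python
  # url_value_preprocessor pops nav values off every incoming request and stashes them
  # on g; url_defaults feeds them back into url_for calls that didn't pass them explicitly.
  from flask import Blueprint, g

  blueprint = Blueprint('main', __name__)  # assumed name

  @blueprint.url_value_preprocessor
  def pull_nav_values(endpoint, values):
      if values:
          g.season = values.pop('season', None)
          g.league = values.pop('league', None)

  @blueprint.url_defaults
  def add_nav_defaults(endpoint, values):
      if 'season' not in values and getattr(g, 'season', None) is not None:
          values['season'] = g.season
      if 'league' not in values and getattr(g, 'league', None) is not None:
          values['league'] = g.league
  ```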
- The hardest part of the league change was handling all the navs and links moving everywhere properly. This is a little easier in a node app with something like react/redux, where you have a full state container.
- Did some cool conditional behavior on the matchups view. If a league was passed (explicitly via the route, which flask must handle), it strips it and redirects to the leagueless matchups endpoint. The python side can parse and handle the values as args/kwargs at will, but if a url is fetched from the server, there MUST be a @route to handle it. You can't just url_defaults or url_value_preprocessor it away; you must add a @route to handle it first. Then you can redirect / add data / remove data as desired.
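- In sketch form (routes and endpoint names are stand-ins for the real ones):

  ```python
  # A @route has to exist to catch the league-qualified url at all; only then can the
  # view strip the league and redirect to the leagueless endpoint.
  from flask import Blueprint, redirect, url_for

  blueprint = Blueprint('main', __name__)  # assumed name

  @blueprint.route('/season/<int:season>/league/<int:league>/matchups/<int:week>')
  def matchups_with_league(season, league, week):
      # drop the league and bounce to the plain matchups endpoint
      return redirect(url_for('main.matchups', season=season, week=week))

  @blueprint.route('/season/<int:season>/matchups/<int:week>')
  def matchups(season, week):
      ...
  ```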
- Turned on the activity overview (commits vs code reviews etc) for my github profile.
- Ansible (even with -vvv and stdout_lines assigned to debug vars and such) does not print the stdout and stderr of the remote commands back on the host machine. If you want to watch, ssh in and run it yourself.
- Reorganized and cleaned the kitchen, moving the machines under the shelves and removing the mat. Looks much better now. Cleaned the fridge/freezer too. Made a new batch of oat milk. Started soaking a large batch of garbanzo beans. Bottled the green tea hibiscus kombucha without a second fermentation. Threw away the scoby; I don't plan on making another batch soon.
- Made another batch of protein bars. Just pecans, oats, and protein powder in this one. Delicious. Better than the tahini base.
- I didn’t think about it until after, but add some oat milk to the next batch. It will make them stick together better, allow you to add more protein powder, and impart a little taste.
- Gbro didn’t finish the 1/2 cup over the course of the day. I’ll leave it at the same serving for now (1/2c once a day instead of twice). Online resources say that appetites vary, but a healthy cat weight is one where you can “easily feel the ribs”. We’re not close.
- GitLab.
- https://www.youtube.com/watch?v=nMAgP4WIcno.
- Created an account, started to look through the full capabilities.
- Combines Stash (SCM), JIRA (tickets), and Bamboo (CI/CD).
- It’s not an extension to github, it’s a full alternative. It’s a competitor.
- You can edit in the browser, open merge requests (they call them the correct thing, not pull requests!), create boards, plan sprints, etc.
- You get a lot of CI configuration right out of the box. It infers your language, creates a container, runs static analysis, checks licenses, runs security tests – all without you doing anything. You can, of course, customize all of this. They have advanced stages for dynamic testing, CSRF, and more.
- It auto-comments a lot of helpful information on the MR: performance changes, code quality deltas, etc.
- CD supports different environments, partial deployments, and more.
- Microsoft bought GitHub last year for 7.5b.
- I’m going to finish these few big tickets and then I’ll migrate all my projects/repos over from github to gitlab.
- https://www.programcreek.com/python/ is an awesome website. It basically just scrapes open source python projects for whatever you’re looking for. In most cases, you’ll get a working implementation of the string you’re curious about.