🏳️‍🌈 Be kind, and have an adventure.

Organizations

@KammaData

othyn/README.md

Ahoy-hoy! 👋

You'll find all sorts in here; the goal is to be slightly more coherent than /dev/urandom. I would say the private or proprietary repos are where the treasures are at, but that depends on what you class as 'treasure', yarr... yarrr.

Be kind, and have an adventure. ♥️

Pinned

  1. macos-auto-clicker Public

    A simple auto clicker for macOS Big Sur, Monterey, Ventura, Sonoma and Sequoia.

    Swift · 300 stars · 37 forks

  2. github-folder-icon-macOS Public

    GitHub folder icon for use in macOS, made so that it fits nicely alongside all the others inside your home directory. Supports Mac OS X 10.5 (Leopard) through to macOS 15 (Sequoia).

    Makefile · 67 stars

  3. go-calendar Public

    A community driven auto-updating Pokémon GO events calendar that you can subscribe to in any calendar app on your phone/PC. Powered by Leek Duck.

    PHP · 40 stars · 12 forks

  4. Setting up a local only LLM (Qwen/Llama3/etc.) on macOS with Ollama, Continue and VSCode

    As with a lot of organisations, the idea of using LLMs is a reasonably frightening concept, as people freely hand over internal IP and sensitive comms to remote entities that are heavily data-bound by nature. I know it was on our minds when deciding on LLMs and their role within the team and the wider company. Six months ago, I set out to explore what the offerings were like in the self-hosted and/or OSS space, and whether anything could be achieved locally. Having used this setup since then, and after getting a lot of questions about it, I thought I'd share some of the things I've come across and how to get it all set up.

    Cue in [Ollama](https://ollama.com/) and [Continue](https://marketplace.visualstudio.com/items?itemName=Continue.continue). Ollama is an easy way to locally download, manage and run models. It's very similar to Docker in its usage, and can probably be most conceptually aligned with it in how it operates: think images = LLMs. Continue is the other side of that, being the bridge/interconnect that lets what's in VSCode talk to Ollama in a way that makes sense. (A minimal setup sketch follows after this list.)
  5. Fix horrendously bad macOS (12.3.1 t...

    # Intro

    Out of the box, my SMB performance on macOS 12.3.1 would top out at around 20 MB/s in short ~5 second bursts, which was absolutely horrendous: slow to navigate in Finder and sluggish to interact with.

    Since making these changes, I now get a sustained ~80-100 MB/s+ and instant Finder navigation, which is superb and how things should be out of the box (OOTB)! (A sketch of one commonly cited tweak follows after this list.)
  6. How to disable default menu bar item...

    //
    //  App.swift
    //
    //  Created by Ben Tindall on 30/03/2022.
    //
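
To make the local-LLM setup in item 4 concrete, here is a minimal sketch of the two moving parts it describes: Ollama serving a model locally, and Continue pointed at that local endpoint from VSCode. The install method, the `llama3` model tag and the `~/.continue/config.json` path/shape are assumptions for illustration (Continue's config format has changed across versions), not the gist's exact steps.

```sh
# Install and start Ollama (Homebrew shown here; the official installer works too)
brew install ollama
ollama serve &                 # local API, listens on http://localhost:11434 by default

# Pull and smoke-test a model — everything stays on the machine
ollama pull llama3
ollama run llama3 "Summarise what a background daemon is in one sentence."

# Point Continue at the local Ollama instance. This JSON shape follows the older
# ~/.continue/config.json style and is illustrative rather than authoritative.
cat > ~/.continue/config.json <<'EOF'
{
  "models": [
    { "title": "Llama 3 (local)", "provider": "ollama", "model": "llama3" }
  ]
}
EOF
```

With that in place, Continue's chat panel talks to the local Ollama API rather than a hosted provider, which is the whole point of the exercise.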
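The preview of item 5 cuts off before the actual changes, so the snippet below is not necessarily the gist's set of tweaks; it is one commonly cited macOS SMB client change for exactly this symptom, shown as a sketch.

```ini
# /etc/nsmb.conf — macOS SMB client settings (create the file if it doesn't exist)
[default]
# Disable SMB packet signing, a frequent cause of throttled transfer speeds
# when talking to NAS/Samba shares from macOS clients.
signing_required=no
```

Remount the share (or reboot) afterwards so the client picks up the setting.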
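Item 6's preview only shows the Swift file header, so the sketch below is an assumption about the technique involved: in SwiftUI, the stock menu bar items can be removed by replacing their command groups with empty ones. The type names (DemoApp, ContentView) are placeholders, not taken from the gist.

```swift
import SwiftUI

@main
struct DemoApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commands {
            // Replace the default File > New item group with nothing,
            // removing "New" / "New Window" from the app's menu bar.
            CommandGroup(replacing: .newItem) { }
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, menu bar!")
            .padding()
    }
}
```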
