Life hack for github-cli:
gh repo clone $(printf 'squidowl/halloy%.0s ' {1..2})
i.e. clone into the owner/repo directory, not just repo. #github
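The command substitution simply repeats the path, and gh repo clone takes an optional second argument as the target directory:
# printf repeats its format string once per argument; %.0s consumes each
# argument of {1..2} without printing anything, so the path comes out twice:
gh repo clone squidowl/halloy squidowl/halloy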
EDIT:
I wrote a small #bash function (it also works in #zsh, of course) to make cloning easier in my environment:
gh-repo-work() {
  local url=$1
  # Strip the scheme and host, keeping just the owner/repo path
  # (a plain owner/repo argument passes through unchanged):
  local url_path=${url#*\.*/}
  # Clone into the GitHub tree:
  gh repo clone "$url_path" "$HOME/work/github/$url_path"
}
“Demo”:
~ main*
❯ gh-repo-work m4b/goblin
Cloning into '/Users/jarkko/work/github/m4b/goblin'...
remote: Enumerating objects: 7261, done.
remote: Counting objects: 100% (1215/1215), done.
remote: Compressing objects: 100% (326/326), done.
remote: Total 7261 (delta 977), reused 922 (delta 889), pack-reused 6046
Receiving objects: 100% (7261/7261), 3.22 MiB | 4.69 MiB/s, done.
Resolving deltas: 100% (5565/5565), done.
~ main*
❯ ls -1 work/github/m4b/goblin
CHANGELOG.md
Cargo.toml
LICENSE
Makefile
README.md
assets
etc
examples
fuzz
fuzz-afl
src
tests
~ main*
❯ rm -rf work/github/m4b/goblin
~ main*
❯ gh-repo-work https://github.com/m4b/goblin
Cloning into '/Users/jarkko/work/github/m4b/goblin'...
remote: Enumerating objects: 7261, done.
remote: Counting objects: 100% (1227/1227), done.
remote: Compressing objects: 100% (337/337), done.
remote: Total 7261 (delta 988), reused 923 (delta 890), pack-reused 6034
Receiving objects: 100% (7261/7261), 3.23 MiB | 6.68 MiB/s, done.
Resolving deltas: 100% (5564/5564), done.
~ main*
❯ ls -1 work/github/m4b/goblin
CHANGELOG.md
Cargo.toml
LICENSE
Makefile
README.md
assets
etc
examples
fuzz
fuzz-afl
src
tests
The list of papers accepted at the 2nd #eBPF workshop has been published by ACM: https://dl.acm.org/doi/proceedings/10.1145/3672197#tableOfContent.
Trump to the press: I will crush you like the bugs you are.
Journalists: Haha there goes Trump again.
Biden: I'm incredibly disappointed in your coverage.
Journalists: How dare you? Resign immediately, you ungrateful pathetic SOB.
Flow planned for my cheapo #BPF flame graph for a single driver:
The host side then consumes the fixed-size packets and puts matching stacks into the same bucket. A second thread can then periodically compose a flame graph from the data collected so far.
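The bucketing is essentially the folded-stack format that Brendan Gregg's flamegraph.pl consumes. A minimal shell sketch of that step (file names are hypothetical, and it assumes the stacks have already been symbolized into one semicolon-separated line per sample, with no spaces):
# Count identical stacks (the buckets) and emit "stack count" lines:
sort stacks.folded | uniq -c | awk '{ print $2, $1 }' > counted.folded
# flamegraph.pl from the FlameGraph repo renders the counted stacks as an SVG:
./flamegraph.pl counted.folded > flame.svg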
Somehow got into learning this eBPF stuff during the holidays :-) Super interesting and addicting.
GitHub's co-founder and former CEO launched the Ladybird Browser Initiative, a non-profit backing Ladybird, a brand-new independent browser written from scratch.
https://linuxiac.com/ladybird-is-a-new-browser-initiative-backed-up-by-1m/
I think, based purely on experience from previous tech revolutions, that #AI is neither useless nor going to repeal and replace human labor.
It just hasn't hit its ceiling, or more precisely its constraints, yet.
The media giving a voice only to AI companies, or to AI researchers turned doomsday predictors, is at the very least a fairly strong signal of a bubble.
If you feel that AI is evil, here are a couple of suggestions for what you can do: