screen mongo encryption compression cron networking git disk shell awk docker aws cli imagemagick ffmpeg

Screen

start, reconnect and list

screen [-S name] # start a new session
screen -ls # list screens
screen -r name # reconnect to session
screen -d -m -S name # start a new session for sharing
screen -x name # join a shared screen

Screen commands

Use Ctrl+A prefix to run commands within screen
d # detach from session
c # create a new window
0-9 # jump to window 0-9 (windows are 0-indexed); n/p for next/previous window
Shift+S # split screen horizontally
| # split screen vertically
Tab # Jump between regions of split screen
Q # Close all other regions
X # Close current region
Esc (or [) # enter scrollback/copy mode; Esc again to exit

Mongo

Export/Import/Dump

Run these from the system shell (i.e. not the mongo shell)
mongoexport --db <database> --collection <collection> --out <output_file.json>
mongoimport --db <database> --collection <collection> --file <input_file.json>
mongodump -d <database> -o <output_dir>

Encryption

hash a file

md5 file.txt # macOS (use md5sum on Linux)

# Linux
sha256sum file.txt
echo -n "foobar" | sha256sum

# macOS
shasum -a 256 file.txt
echo -n "foobar" | shasum -a 256
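
To verify a download against a published checksum, feed `sha256sum -c` a `<hash>  <file>` line (GNU coreutils; `shasum -a 256 -c` works the same way on macOS). A minimal sketch with a throwaway file:

```shell
# Verify a file against a known SHA-256 checksum.
printf 'foobar' > demo.txt
echo "c3ab8ff13720e8ad9047dd39466b3c8974e592c2fa383d4a3960714caef0c4f2  demo.txt" | sha256sum -c -
# demo.txt: OK
```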

openssl

openssl list-cipher-algorithms # get list of available ciphers ('openssl list -cipher-algorithms' in OpenSSL 1.1+)

# Linux
openssl aes-128-cbc < plain.txt > cipher.txt.crypt # encrypt
openssl aes-128-cbc -d < cipher.txt.crypt > plain.txt  # decrypt

# macOS
openssl aes-128-cbc -in plain.txt -out cipher.txt.crypt # encrypt
openssl aes-128-cbc -d -in cipher.txt.crypt -out plain.txt  # decrypt

openssl dgst -sha256 -binary <filename> # hash a file (drop -binary for hex output)
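
A scripted round trip of the encrypt/decrypt commands above. Assumes OpenSSL 1.1.1+ for `-pbkdf2`; the `pass:` form is demo-only since the password shows up in the process list:

```shell
echo "secret data" > plain.txt
openssl enc -aes-128-cbc -pbkdf2 -pass pass:hunter2 -in plain.txt -out cipher.bin    # encrypt
openssl enc -d -aes-128-cbc -pbkdf2 -pass pass:hunter2 -in cipher.bin -out round.txt # decrypt
diff plain.txt round.txt && echo "round-trip OK"
```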

gpg

gpg --cipher-algo AES256 -c filename.tar.gz # encrypt
gpg -o filename.tar.gz -d filename.tar.gz.gpg # decrypt

Compression

tarballs

tar -czf <to_file>.tar.gz <from_file> # Compress a file or directory into a *.tar.gz file
tar -xvf <from_file>.tar.gz [-C /foo/bar] # decompress a *.tar.gz file
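
A quick round trip with throwaway files to sanity-check the flags:

```shell
# Compress a directory, then extract it somewhere else.
mkdir -p demo/sub
echo "hello" > demo/sub/file.txt
tar -czf demo.tar.gz demo              # compress
mkdir -p extracted
tar -xzf demo.tar.gz -C extracted      # decompress into ./extracted
cat extracted/demo/sub/file.txt        # hello
```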

zip

zip -r myfile.zip myfolder
unzip myfile.zip

Cron

Crontab

crontab -l # print out the crontab file
crontab -e # edit the crontab file
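
Each crontab line is five schedule fields (minute, hour, day of month, month, day of week) followed by a command; the script paths below are hypothetical:

```
# minute hour day-of-month month day-of-week  command
30 3 * * *    /home/user/backup.sh   # every day at 03:30
*/5 * * * *   /home/user/poll.sh     # every 5 minutes
0 9 * * 1     /home/user/report.sh   # Mondays at 09:00
```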

Networking

Configuration

/etc/network/interfaces
/etc/wpa_supplicant/wpa_supplicant.conf

Commands

ifdown/ifup eth0 # restart a specific network interface
/etc/init.d/networking restart # restart all interfaces
/sbin/ifconfig -a # see all interfaces (modern equivalent: ip addr)
sudo iwlist wlan0 scan # scan for wireless networks

DNS Lookup Utilities

nslookup <domain | ip> # query Internet name servers
host <domain>

dig +short hostname.com
dig +nssearch hostname.com
dig +trace hostname.com
dig @8.8.8.8 hostname.com
dig +short myip.opendns.com @resolver1.opendns.com # get public ip

Watch Network Traffic

sudo iftop -f "port 3000" # watch traffic on port 3000
netstat -ac 5 # show network status

Other

sudo killall -HUP mDNSResponder; sudo killall mDNSResponderHelper; sudo dscacheutil -flushcache # flush DNS cache on macOS

Git

Summary

<index> is a snapshot of the proposed next commit.
<branch> is a label at the tip of a sequence of commits.
<HEAD> is a label that points at a <branch>; the <index> is committed on top of it.
<working-directory/working-tree> is the files in your editor. It’s what you see and work with. The sandbox.

Reset

Moves the <branch> label to another commit (with HEAD following the <branch> label)
git reset [--soft|--mixed|--hard] [HEAD~1] [filename] # defaults to the current HEAD
 --soft) Move <HEAD> (and its <branch>) to the given commit
--mixed) Also make <index> look like <HEAD> (i.e. clears the <index>). The default.
 --hard) Also make the working directory look like <HEAD>
note: resetting a single file only works in --mixed mode.
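
A sandbox run in a throwaway repo showing that --soft moves the branch but keeps the index (file names are made up):

```shell
# Throwaway repo: two commits, then undo the second with --soft.
cd "$(mktemp -d)"
git init -q .
git -c user.email=a@b.c -c user.name=demo commit -q --allow-empty -m "first"
echo "x" > f.txt
git add f.txt
git -c user.email=a@b.c -c user.name=demo commit -q -m "second"
git reset --soft HEAD~1   # undo "second"; f.txt is still staged
git status --short        # A  f.txt
```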

Checkout

Moves HEAD/index/working-dir to <branch>
git checkout [branch] [filename] # updates HEAD to <branch> and will leave or attempt to merge changes in working-dir
git checkout . # overwrite unstaged changes with the index
git checkout 'master@{7 months ago}' -- path/to/file.txt # Restore a file
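
A throwaway-repo sketch of discarding an unstaged edit by checking the file out of the index:

```shell
# Commit a file, scribble on it, then restore the committed version.
cd "$(mktemp -d)"
git init -q .
echo "v1" > notes.txt
git add notes.txt
git -c user.email=a@b.c -c user.name=demo commit -q -m "add notes"
echo "scratch edit" > notes.txt   # unwanted working-tree change
git checkout -- notes.txt         # restore from the index
cat notes.txt                     # v1
```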

Clean

Remove files unknown to git (by default untracked files)
git clean -f # force (required unless clean.requireForce is set to false)
git clean -n # dry run; show what would be removed
git clean -x # also remove ignored files
git clean -X # only remove ignored files
git clean -d # also remove untracked directories

Tag

git tag -l "v4*" # list all tags that start with 'v4'
git show v4.5.7 # show details of tag
git tag -a v4.5.8 -m "tag message" # create an annotated tag
git tag -a v4.5.8 2cfbe90 # add a tag to an old commit
git push origin v4.5.8
git tag --delete tagname # delete a local tag
git push --delete origin tagname # delete remote tag 

Other commands

git log --pretty=format: --name-only --since="1 year ago" # show files for each commit over the past year
git log --pretty=oneline
git log --pretty=format:"%h %an %as %s" --author=<author> path/to/file.java # show short hash, author, date and commit message for a file by an author
git merge --squash your-branch-name # squash another branch as a single commit and merge
git rebase # Reapply commits on top of another base tip
git stash # save all local tracked changes and reset HEAD
git update-index --assume-unchanged <file name> # Temporarily ignore changes to a file
git update-index --no-assume-unchanged <file name> # Undo the above
git commit --amend -m "an updated commit message"
git commit --amend --no-edit
git add --patch <filename> # stage a file in chunks, choosing [y/n/q/a] per hunk

Disk Commands

Disk Usage

df -h # display free disk space
du -sh # display disk usage statistics
du -h --max-depth=1
lsblk # list block devices

Volumes

sudo mkfs.ntfs -f /dev/sdd1 # format disk to NTFS
# To use a disk across mac/linux/windows format it on mac to exfat. On ubuntu install exfat-fuse exfat-utils

sudo mount /dev/sdd1 /media/usb # mount disk

sudo umount /dev/sdd1 # unmount disk
sudo umount -l /dev/sdd1 # lazy unmount, if the disk won't unmount
sudo umount -f /dev/sdd1 # force unmount

Memory

top # display and update sorted information about processes
free -g # memory allocation
vmstat -s # memory statistics
sudo dmidecode --type memory # physical memory details
less /proc/meminfo # (free and vmstat read this file)
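
Since the commands above ultimately read /proc/meminfo, you can pull fields out of it directly with awk (Linux only; values are in kB):

```shell
# Print total and available memory in GB (1 GB = 1048576 kB).
awk '/^MemTotal|^MemAvailable/ {printf "%s %.1f GB\n", $1, $2/1048576}' /proc/meminfo
```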

AWS CLI

S3

aws s3 mb 's3://bucket-name' # make bucket
aws s3 rb 's3://bucket-name' [--force] # remove bucket
aws s3 ls ['s3://bucket-name/path']
aws s3 cp local.txt 's3://bucket-name/remote.txt'
aws s3 rm 's3://bucket-name/remote.txt'
aws s3 mv 's3://bucket-name/path' ./dir --recursive --exclude '*' --include '*.jpg' # --include only re-includes after an --exclude
aws s3 sync ./dir 's3://bucket-name/path' [--delete]
aws s3 presign 's3://bucket-name/path' --expires-in <seconds> # generate a temporary signed url

glacier

aws glacier list-vaults --account-id -
aws glacier create-vault --account-id - --vault-name <my_vault>
aws glacier describe-vault ...

aws glacier list-multipart-uploads ...
aws glacier abort-multipart-upload ... --upload-id <upload_id>

aws glacier initiate-job ... --job-parameters '{"Type": "inventory-retrieval"}'
aws glacier describe-job ... --job-id ""
aws glacier get-job-output ... --job-id "" outfile.json

aws glacier delete-archive ...
aws glacier delete-vault ... # Vault must be empty before you delete it

dynamodb

aws dynamodb create-table --table-name <my_table> --attribute-definitions AttributeName="string",AttributeType="[S|N|B]" \
	--key-schema AttributeName="string",KeyType="[HASH|RANGE]" --provisioned-throughput ReadCapacityUnits=long,WriteCapacityUnits=long
aws dynamodb list-tables
aws dynamodb describe-table --table-name <my_table>
aws dynamodb put-item --table-name <my_table> --item '{"string":{"N":"1"},"string":{"SS":["the", "quick", "brown", "fox"]}}'
aws dynamodb get-item --table-name <my_table> --key '{"string":{"N":"1"}}' --projection-expression "string"

--endpoint-url 'http://localhost:8000' # parameter for local development
# S - String, N - Number, B - Binary, BOOL - Boolean, SS - string set

Shell

Date

date # print the current date and time
date +"%Y"
DATE=$(date +"%Y%m%d%H%M")
MONTH=$(date +"%m")
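
A common use is building timestamped filenames, e.g. for backups (names are illustrative):

```shell
# Timestamped filename: YYYYMMDDHHMM, 12 digits.
DATE=$(date +"%Y%m%d%H%M")
BACKUP="backup-${DATE}.tar.gz"
echo "$BACKUP"   # e.g. backup-202401311530.tar.gz
```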

grep

grep -rni "string" . # recursively grep all files under the current directory (case-insensitive, with line numbers)

find

find /foo/bar -name "filename.txt" # find file
find ~ -name "*.pdf" # find PDF files (quote the pattern so the shell doesn't expand it)
   -executable # find an executable file
   -amin -10 # find a file that was accessed less than 10 minutes ago
   -newer reference.txt # find a file edited after the file "reference.txt"
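
Matches can be acted on with -exec; a sketch with throwaway files:

```shell
# Create sample files, then delete every .txt match in one go.
mkdir -p finddemo
touch finddemo/a.pdf finddemo/b.txt finddemo/c.txt
find finddemo -name "*.pdf"                 # finddemo/a.pdf
find finddemo -name "*.txt" -exec rm {} +   # delete every match
ls finddemo                                 # a.pdf
```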

random

echo "body of email" | mail -s "Subject Line" <address@domain.com> # send email
watch -n 1 'cat test.txt' # runs the cat command every second
echo "Hello!" | xargs echo # injects the output of the first command as an argument of the second command
uname -a # shows info about the system (kernel, architecture...)
sudo lshw [-short] # list hardware components
lscpu # show information about the CPU
ps # shows the processes running right now on this terminal session
ps aux # shows all the processes running on the computer
mkdir -p foo/bar/baz # create all necessary directories

awk

awk '{print $0}' somefile # $0 is the entire line (all fields)
awk '{print $3}' somefile # print only the 3rd column. columns start at index 1
awk '{$1=$2=""; print $0}' somefile # print all but the first two columns
awk '{print $2 ", " $3}' somefile # print the 2nd and 3rd columns, separated with a comma
awk '{print $2 + $3}' somefile # print the sum of the 2nd and 3rd columns
awk 'length($0) > 20' somefile # print those lines whose length is longer than 20 characters
awk '$2 > 100' somefile # print those lines where the value of the second column is greater than 100
awk -F '\t' '{print $6}' somefile # use tab as delimiter
awk '{ print $NF "\t" $(NF-2)}' somefile # NF is the number of fields; NR is the current line number
awk '/hello/{ print "This line contains hello", $0}' somefile # pattern matching
awk '$4~/hello/{ print "This field contains hello", $4}' somefile # field pattern matching
awk '$4 == "0439023483"{ print $6 }' somefile # exact matching
awk '{ s = ""; for (i = 4; i <= NF; i++) s = s $i " "; print s }' somefile # Print all columns from 4 and on
awk '{ printf "%s \t %-5s", $1, substr($2,1,5)}' somefile # use printf and built in functions to format output
awk 'BEGIN { print "start up" } { print "line match" } END { print "tear down" }' somefile
# see more https://earthly.dev/blog/awk-examples/
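
A tiny sample file to try the one-liners above on:

```shell
# Three whitespace-separated columns: name, number, team.
printf "alice 100 dev\nbob 250 ops\ncarol 175 dev\n" > staff.txt
awk '$2 > 150 {print $1}' staff.txt              # bob, carol
awk '{sum += $2} END {print sum}' staff.txt      # 525
awk '$3 == "dev" {n++} END {print n}' staff.txt  # 2
```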

Docker

exec

docker exec -it <container-id> <command> # sh, npm run test

run

docker run [-it] [--rm] [-p <host>:<container>] <image> [command] # create and start a container from an image

ps

docker stats $(docker ps --format '{{.Names}}') --no-stream
docker ps -a --format '{{.ID}}  {{.Status}}     {{.Names}}  {{.Ports}}'

random

docker stop $(docker ps -a -q) # stop all containers
docker rm $(docker ps -a -q) # remove all containers
docker system prune -a # remove unused data
ENV NPM_CONFIG_LOGLEVEL info # set environment variable from Dockerfile
-e NPM_CONFIG_LOGLEVEL=info # set environment variable from docker run
docker system df # show docker disk usage

ImageMagick

dwebp file.webp -o file.png # decode WebP (dwebp is from libwebp and outputs PNG, not JPEG)
convert file.webp file.jpg # ImageMagick proper ('magick' in IM7; requires the WebP delegate)

ffmpeg

ffmpeg -i input.mov -vf "setpts=(PTS-STARTPTS)/15" -crf 18 -an output.mov # Speeds up video 15x and removes sound
ffmpeg -i input.mp4 -ss 00:00:30 -t 10 output.mp4 # Cut a 10-second segment starting at the 30-second mark
ffmpeg -f concat -safe 0 -i <(for f in part*.mp4; do echo "file '$PWD/$f'"; done) -c copy output.mp4 # Concatenate video files
ffmpeg -i input.mp4 -vcodec libx265 -crf 28 output.mp4 # reduce video size by re-encoding with H.265

Random Mac commands

How to Disable Chrome Mac Gestures (Back and Forward)

defaults write com.google.Chrome AppleEnableMouseSwipeNavigateWithScrolls -bool false
defaults write com.google.Chrome AppleEnableSwipeNavigateWithScrolls -bool false

copy/paste from command line

pbcopy < file.txt # copy stdin to the clipboard
pbpaste # print the clipboard to stdout

Troubleshooting

# Bring an off-screen window back onto the screen: select the application in the Dock, then choose "Window" > "Zoom".