Wildcard to only match certain files?
So I am trying to create a wildcard that will only match certain files, but I am not sure how to do so. In my dir the files look like this:
10.0.0.1.csv
10.0.0.2.csv
10.0.0.3.csv
10.0.0.4.csv
10.0.0.5.csv
10.0.0.6.csv
10.0.0.7.csv
10.0.0.8.csv
10.0.0.9.csv
10.0.0.10.csv
toplist.csv
usage.csv
trial.csv
I only want to match the files with the IP addresses in their names, but the IP address in the filename may change. How can I accomplish this?
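One way to sketch this (an illustration, not from the post): bash's extended globs can express "digits.digits.digits.digits.csv" once `extglob` is enabled; `+([0-9])` means "one or more digits".

```shell
#!/bin/bash
# Sketch: match only files named like an IPv4 address followed by .csv.
# +([0-9]) is an extended glob pattern meaning "one or more digits".
shopt -s extglob
for f in +([0-9]).+([0-9]).+([0-9]).+([0-9]).csv; do
    echo "$f"
done
```

With the listing above, this would print the ten 10.0.0.N.csv files and skip toplist.csv, usage.csv, and trial.csv.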
https://redd.it/186tv00
@r_bash
A single function to extract all archive file types at once: tar{gz,tgz,xz,lz,bz2} + zip and 7z
####################################################################################
##
## How to demonstrate:
##
## 1) Add all of the following types of archives to an empty directory
## - name.tar.gz
## - name.tar.tgz
## - name.tar.xz
## - name.tar.bz2
## - name.tar.lz
## - name.zip
## - name.7z
##
## 2) Add the untar function to your .bash_functions or .bashrc file and source it or restart your terminal
##
## 3) Simply type "untar" in the same directory as the above archive files
##    - The second method is to run this script in the same directory as the archive files using the command "bash untar.sh"
##
## Result: All archives will be extracted and ready to use
##
####################################################################################
You need 7z installed for the 7z and zip archive files, and the tar command for the tar files.
For the lz tar files, you can install support with `sudo apt install lzip`.
If you like, you can use my GitHub 7-Zip script to instantly install the latest 7-Zip version 23.01:
bash <(curl -fsSL https://7z.optimizethis.net)
[GitHub Script: 7-zip installer](https://github.com/slyfox1186/script-repo/blob/main/Bash/Installer%20Scripts/SlyFox1186%20Scripts/build-7zip)
I hope you guys think it's worthy!
Cheers!
[GitHub Script: untar.sh](https://github.com/slyfox1186/script-repo/blob/main/Bash/Misc/untar.sh)
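For reference, a minimal sketch of what such a function can look like (this is an illustration, not the linked untar.sh; it assumes GNU tar and 7z are on PATH):

```shell
#!/bin/bash
# Illustrative "extract everything in the current directory" helper.
# GNU tar auto-detects gz/xz/bz2/lz compression; 7z handles zip and 7z.
untar() {
    local f
    for f in *.tar.gz *.tgz *.tar.xz *.tar.bz2 *.tar.lz; do
        [ -e "$f" ] && tar -xf "$f"
    done
    for f in *.zip *.7z; do
        [ -e "$f" ] && 7z x -y "$f" >/dev/null
    done
}
```

As the post notes, extracting .lz tarballs still needs lzip installed for tar's auto-detection to work.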
https://redd.it/186zwdp
@r_bash
Bash projects
What are some bash projects that you would consider a “trophy” on a resume, or that have some potential in general? (I’m mostly interested in blue team / red team related projects.)
https://redd.it/186zu8j
@r_bash
How to return lines from a JSON log file which contain matched IP and timestamp values from a CSV?
I want to check whether IP and timestamp values exist in some form in the line entries of the JSON log file, and if so, return that specific JSON log entry to another file. I tried to make it universal so it's applicable to all IP addresses. Here's what the sample CSV file would look like:
"clientip","destip","desthostname","timestamp"
"127.0.0.1","0.0.0.0","randomhost","2023-09-09T04:18:22.542Z"
And a sample line entry from the Json Log File
{"log": "09-Sept-2023 rate-limit: info: client @xyz 127.0.0.1", "stream": "stderr", "time": "2023-09-09T04:18:22.542Z"}
It's the lines from the JSON log file that we want to return in the output.txt file when there's a match. The JSON file doesn't have the same fields and organization as the CSV does (with clientip, destip, desthostname, timestamp), but I was hoping that I could still at least return lines from the JSON log file to a new file that had matches on the clientip (like we see here with 127.0.0.1 in "info: client @xyz 127.0.0.1") and maybe the timestamp.
I tried using the join command, like `join file.csv xyz-json.log > output.txt`, and awk did not yield much either.
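A hedged sketch of one approach (the filenames are the post's; the CSV layout is assumed from the sample): build a patterns file of client IPs from the CSV, then grep the log with it.

```shell
#!/bin/bash
# Sketch: pull client IPs from the CSV's first column, then return every
# log line containing one of them. Sample data mirrors the post.
cat > file.csv <<'EOF'
"clientip","destip","desthostname","timestamp"
"127.0.0.1","0.0.0.0","randomhost","2023-09-09T04:18:22.542Z"
EOF
cat > xyz-json.log <<'EOF'
{"log": "rate-limit: info: client @xyz 127.0.0.1", "time": "2023-09-09T04:18:22.542Z"}
{"log": "rate-limit: info: client @xyz 10.9.9.9", "time": "2023-09-09T05:00:00.000Z"}
EOF
# Skip the header row, take field 1, strip the quotes -> one IP per line
tail -n +2 file.csv | cut -d, -f1 | tr -d '"' > ips.txt
# -F: fixed strings, -f: patterns from file, -w: whole words
# (-w keeps 10.0.0.1 from also matching inside 10.0.0.10)
grep -Fw -f ips.txt xyz-json.log > output.txt
```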
https://redd.it/1877qzz
@r_bash
set -e, but show error message and line-number
I usually use set -e in my bash scripts.
Unfortunately, you only get a non-zero exit value if a command fails, but no warning message.
Is there a simple way to get an error message and the line number if a bash script stops because a command returned a non-zero status?
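A common pattern (a sketch, not from the thread) is to pair set -e with an ERR trap that names the failing command and line:

```shell
#!/bin/bash
# Sketch: set -e stops the script; the ERR trap reports what failed and where.
demo=$(mktemp)
cat > "$demo" <<'EOF'
#!/bin/bash
set -eE   # -E: functions and subshells inherit the ERR trap
trap 'echo "Error on line $LINENO: \"$BASH_COMMAND\" failed" >&2' ERR
echo "step 1"
false            # the script stops here, with a message on stderr
echo "never reached"
EOF
status=0
bash "$demo" || status=$?
echo "demo exited with status $status"
rm -f "$demo"
```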
https://redd.it/187fr0c
@r_bash
incremental Backup
Hey,
I'm working on an rsync script to create incremental backups. I've used many different sources around the net, including ChatGPT, to get to where I'm at. I understand approximately 75% of what's going on, and I've hit a point where I'm not grasping why some things are the way they are... The script creates three directories: daily, weekly, monthly. Each of these has subdirectories, e.g. (daily) 2023-11-30-02:00, 2023-11-29-02:00, etc.; (weekly) 46, 47, 48. The script is supposed to delete backups past a defined age (I'll use "daily" as the example from here on out), $DAILY_RETENTION_DAYS. The directory name correctly corresponds to the date the directory was actually created, but the mtime is the date I first ran the script, which is Nov 9. All the directories have a Nov 9 mtime regardless of when they were actually created. I assume this is due to how links operate. Why are the directories' mtimes off?
#!/bin/bash
# debugging
export PS4='${LINENO}: ' #Prints out the line number being executed by debug
set -xv #Turn on debugging
######### Change Directory as Needed - Must Match Source Dir Name ###########
DESTOPT="MyFiles"
# Define source and destination servers and directories
SOURCESERVER="username@192.168.100.55"
SOURCEDIR="/srv/dev-disk-by-uuid-532c2087-63ea-4d82-9194-b2d49e371aa9/$DESTOPT"
DESTSERVER="localhost"
DESTROOT="/media/Storage01"
DESTDIR="$DESTROOT/$DESTOPT"
# Define backup retention periods in days
DAILYRETENTIONDAYS=7
WEEKLYRETENTIONDAYS=30
MONTHLYRETENTIONDAYS=365
# Define current date/time for use in backup directory name
CURRENTDATE=$(date "+%Y-%m-%d-%H:%M")
CURRENTWEEK=$(date "+%V")
CURRENTMONTH=$(date "+%m")
# Define logs
if [ -d "$DESTROOT/log" ]; then
echo "log directory exists"
else
mkdir -p $DESTROOT/log
fi
LOGDIR="$DESTROOT/log"
LOGFILE="$DESTOPT"-"$CURRENTDATE.log"
# Define rsync options
RSYNCOPTS="-rlthv --exclude=.Trash* --delete-after --itemize-changes --link-dest=$DESTDIR/current --log-file=$LOGDIR/$LOGFILE"
# Define notification threshold
DAILYTHRESHOLD=20
WEEKLYTHRESHOLD=100
MONTHLYTHRESHOLD=400
# Create backup directories on local destination server
mkdir -p "$DESTDIR/daily/$CURRENTDATE"
mkdir -p "$DESTDIR/weekly/$CURRENTWEEK"
mkdir -p "$DESTDIR/monthly/$CURRENTMONTH"
mkdir -p "$DESTDIR/current"
# Run rsync to perform daily backup
RSYNCOUTPUT=$(rsync $RSYNCOPTS $SOURCESERVER:$SOURCEDIR/ $DESTDIR/daily/$CURRENTDATE/)
# Count number of files changed
DAILYNUMCHANGED=$(echo "$RSYNCOUTPUT" | grep -c "^[\>\<cd]")
# Send email notification if number of files changed exceeds threshold
if [[ $DAILYNUMCHANGED -gt $DAILYTHRESHOLD ]]; then
sendmail -t < /home/user/Mail/backupthresholdexceeded.txt
fi
# Remove daily backups older than retention period
# 11-13-23 commented out as results are usually deleting all directories
#find $DESTDIR/daily -maxdepth 1 -type d -mtime +$DAILYRETENTIONDAYS -exec rm -rf {} \;
# Update symlink to most recent daily backup
rm -rf $DESTDIR/current && ln -s $DESTDIR/daily/$CURRENTDATE $DESTDIR/current
#
# Run rsync to perform weekly backup on Monday
if [ `date +%u` -eq 1 ]; then
rsyncoutput=$(rsync $RSYNCOPTS $SOURCESERVER:$SOURCEDIR/ $DESTDIR/weekly/$CURRENTWEEK/ 2>&1)
# Count number of files changed
WEEKLYNUMCHANGED=$(echo "$rsyncoutput" | grep -E '^sent|^total size' | awk '{print $NF}' | paste -sd+ - | bc)
# Remove weekly backups older than retention period
# find $DESTDIR/weekly -maxdepth 1 -type d -mtime +$WEEKLYRETENTIONDAYS -exec rm -rf {} \;
# Update symlink to most recent weekly backup
rm -f $DESTDIR/weekly/current && ln -s $DESTDIR/weekly/$CURRENTWEEK $DESTDIR/weekly/current
# Send email notification if number of files changed exceeds threshold
if [[ $WEEKLYNUMCHANGED -gt $NOTIFICATIONTHRESHOLD ]]; then
sendmail -t < /home/user/Mail/backupthresholdexceeded.txt
fi
fi
#
# Run rsync to perform monthly backup on 1st day of month
if [ `date +%d` -eq 1 ]; then
rsyncoutput=$(rsync $RSYNCOPTS $SOURCESERVER:$SOURCEDIR/ $DESTDIR/monthly/$CURRENTMONTH/ 2>&1)
# Count number of files changed
MONTHLYNUMCHANGED=$(echo "$rsyncoutput" | grep -E '^sent|^total size' | awk '{print $NF}' | paste -sd+ - | bc)
# Remove monthly backup older than retention period
# find $DESTDIR/monthly -maxdepth 1 -type d -mtime +$MONTHLYRETENTIONDAYS -exec rm -rf {} \;
# Update symlink to most recent monthly backup
rm -f $DESTDIR/monthly/current && ln -s $DESTDIR/monthly/$CURRENTMONTH $DESTDIR/monthly/current
# Send email notification if number of files changed exceeds threshold
if [ $MONTHLYNUMCHANGED -gt $NOTIFICATIONTHRESHOLD ]; then
sendmail -t < /home/user/Mail/backupthresholdexceeded.txt
fi
fi
# Turn off debugging
set +xv
ls -la
user@backup:/media/Storage01/myfiles/daily$ ls -la
total 72
drwxrwxr-x 18 user user 4096 Nov 30 01:00 .
drwxrwxr-x 5 user user 4096 Nov 30 01:00 ..
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-15-09:25
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-17-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-18-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-19-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-20-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-20-14:07
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-21-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-22-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-23-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-24-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-25-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-26-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-27-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-28-01:00
drwxrwxr-x 13 user user 4096 Nov 9 16:30 2023-11-29-01:00
Notice I have commented out the lines that delete directories past the defined retention time, because they delete all the directories (every mtime is Nov 9th). How do I fix this?
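Since each backup directory already carries its creation date in its name, one hedged sketch (my illustration, not the poster's script) is to prune by name instead of mtime — rsync's -t preserves timestamps from the source, so a directory's mtime never tracks when the backup was made:

```shell
#!/bin/bash
# Sketch: prune daily backups by the date embedded in the directory NAME
# (e.g. 2023-11-29-01:00) rather than by mtime.
prune_daily() {
    local destdir=$1 retention=$2 cutoff dir name
    cutoff=$(date -d "-$retention days" "+%Y-%m-%d")   # GNU date
    for dir in "$destdir"/daily/*/; do
        [ -d "$dir" ] || continue
        name=$(basename "$dir")
        # Compare the YYYY-MM-DD prefix lexicographically with the cutoff
        if [[ ${name:0:10} < $cutoff ]]; then
            echo "would remove: $dir"   # swap echo for: rm -rf "$dir"
        fi
    done
}
prune_daily /media/Storage01/MyFiles 7   # path from the post, illustrative
```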
https://redd.it/187p2ly
@r_bash
String comparison unexpected results?
I have the following script:
if sqlite3 webapp/instance/app.db "SELECT EXISTS(SELECT 1 FROM users WHERE username='admin');" > /dev/null
then
read -p "admin user exists, overwrite? " answer
if "$answer" = "y" ;
then
echo "Removing old admin account..."
sed -i '/ADMINPASSWORD/d' .env
sqlite3 webapp/instance/app.db "DELETE FROM users WHERE username='admin';"
ADMINPWMAYBE=ADMINPASSWORD
else
ADMINPWMAYBE=NULLPASSWORD
fi
fi
I run this and it goes as follows:
admin user exists, overwrite? y
There is no output, and it doesn't delete the old ADMIN_PASSWORD.
I have added `echo "$answer"` and it is indeed "y", so what am I doing wrong?
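One hedged observation on the shape of that if (offered as a note, not as the poster's fix): `if "$answer" = "y"` executes the value of $answer as a command named y and tests its exit status; a string comparison needs the test builtin:

```shell
#!/bin/bash
# `if CMD` tests CMD's exit status, so `if "$answer" = "y"` tries to run a
# command called y with the argument "=". String comparison uses [ ] or [[ ]]:
answer="y"
if [ "$answer" = "y" ]; then
    echo "matched"
fi
```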
https://redd.it/187meau
@r_bash
grep to output only needed capturing group
https://shscripts.com/grep-to-output-only-needed-capturing-group/
https://redd.it/187tjxd
@r_bash
Calculating with Logs in Bash...
I think bc can do it, or maybe expr, but I can't find enough documentation or even examples.
I want to calculate this formula and display the result in a script I am building...
N = log2(S^L)
It's for calculating the password strength of a given password.
I have S and I have L; I need to calculate N. Short of generating log tables and storing them in an array, I am stuck finding an elegant solution.
Here are the notes I have received on how it works...
----
**Password Entropy**
Password entropy is a measure of the randomness or unpredictability of a password. It is often expressed in bits and gives an indication of the strength of a password against brute-force attacks. The formula to calculate password entropy is:
Entropy = log2(Number of Possible Combinations)
Where:
Entropy is the password entropy in bits.
log2 is the base-2 logarithm.
Number of Possible Combinations is the total number of possible combinations of the characters used in the password.
The formula takes into account the length of the password and the size of the character set.
Here's a step-by-step guide to calculating password entropy:
Determine the Character Set:
Identify the character set used in the password. This includes uppercase letters, lowercase letters, numbers, and special characters.
Calculate the Size of the Character Set (S):
Add up the number of characters in the character set.
Determine the Password Length (L):
Identify the length of the password.
Calculate the Number of Possible Combinations (N):
Raise the size of the character set (S) to the power of the password length (L): N = S^L
Calculate the Entropy:
Take the base-2 logarithm of the number of possible combinations (N): Entropy = log2(N)
This entropy value gives an indication of the strength of the password. Generally, higher entropy values indicate stronger passwords that are more resistant to brute-force attacks. Keep in mind that the actual strength of a password also depends on other factors, such as the effectiveness of the password generation method and the randomness of the chosen characters.
https://redd.it/18804ax
@r_bash
Trying to understand what the period does in a regular expression/wildcard?
I'm a little confused about the difference here:
This is true (returns exit code 0): [[ "aramel" =~ c* ]]; echo $?
This is false (returns 1): [[ "aramel" =~ c.* ]]; echo $?
(Notice the period after the 'c' in the latter.)
Now, both will return success if instead of writing 'aramel' we write the word 'caramel'.
So it seems like one of them requires the c and the other does not?
I've been asking and searching, but I am still unsure about what specifically the dot affects.
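The behavior can be reproduced directly. In an ERE, `*` means "zero or more of the preceding atom", and `=~` succeeds on a match anywhere in the string, so `c*` can match the empty string while `c.*` requires an actual c:

```shell
#!/bin/bash
# c*  : zero or more c's -> the empty match succeeds on any string.
# c.* : a literal c, then anything -> fails when there is no c at all.
check() {                       # helper: print the exit status of [[ =~ ]]
    local s=0
    [[ $1 =~ $2 ]] || s=$?
    echo "\"$1\" =~ $2 -> $s"
}
check aramel  'c*'    # -> 0
check aramel  'c.*'   # -> 1
check caramel 'c.*'   # -> 0
```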
https://redd.it/185ti39
@r_bash
Basename breaks everything somehow
#!/bin/bash
baseFolder=~/gameconverter/psx

outputFolder=~/gameconverter/output
rm -rf ${outputFolder}
mkdir ${outputFolder}

nonSplitted=$(find ${baseFolder} -maxdepth 1 -not -regex ".*(Disc [12]).*")
disc1=$(find ${baseFolder} -maxdepth 1 -regex ".*(Disc [1]).*")
disc2=$(find ${baseFolder} -maxdepth 1 -regex ".*(Disc [2]).*")

for game in "${nonSplitted[@]}"; do
gameName=$(basename "${game}")
input="${baseFolder}/${gameName}/${gameName}.cue"
output="${outputFolder}/${gameName}.chd"
chdman createcd -i "${input}" -o "${output}"
done
There are at least 20 PSX games in baseFolder, but somehow, if I use basename or ##*/, it only runs once, while echo ${game} shows all the subfolders. Why?
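A plausible explanation, offered as a sketch rather than a diagnosis: `$(find …)` assigns one multi-line string, so `"${nonSplitted[@]}"` expands to a single word and the loop body runs exactly once, with basename mangling the whole blob. Reading into a real array makes the loop iterate per path (the directory below is the post's, purely illustrative):

```shell
#!/bin/bash
# mapfile -t reads one array element per line of find's output
# (paths containing newlines excepted).
mapfile -t nonSplitted < <(find ~/gameconverter/psx -maxdepth 1 -mindepth 1 -type d 2>/dev/null)
for game in "${nonSplitted[@]}"; do
    basename "$game"
done
```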
https://redd.it/188m8ch
@r_bash
Getting "read -p" to work within do loop reading files
I'm trying to read a file into my script and prompt for input between each read. When I execute it, the prompt does not appear and only two lines are printed. Removing the "read yn" line means all the files.txt lines do print.
user@local ~/code/bash/interactivefilecopy> source pl.sh
in loop file001.txt
in loop file003.txt
user@local ~/code/bash/interactivefilecopy> cat pl.sh
while read -r linein; do
echo in loop $linein
read -p "whatever" yn
done <files.txt
user@local ~/code/bash/interactivefilecopy> cat files.txt
file001.txt
file002.txt
file003.txt
What am I doing wrong?
Thank you in advance.
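What's happening here is that both reads share stdin inside the redirected loop, so the inner `read` consumes lines from files.txt. A common workaround (a sketch, not necessarily the poster's eventual fix) is to feed the file on its own descriptor:

```shell
#!/bin/bash
# Read the list on fd 3 so `read -p` can still use stdin for the prompt.
printf 'file001.txt\nfile002.txt\nfile003.txt\n' > files.txt
while read -r -u 3 linein; do
    echo "in loop $linein"
    read -r -p "whatever" yn || true   # || true so non-interactive runs continue
done 3< files.txt
```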
https://redd.it/188ltyx
@r_bash
Functions and Libraries
So...... I have started moving all my snips of valuable functions into a bunch of library files, which I will make available to anyone who wants them. The hardest part is documenting everything so it is actually useful.
My next step, once this one is done, is to make a "bash make" tool that scans your script, scans the libraries that are called using `source`, and then builds a single file containing only what is needed. A single file is easier for distribution.
BUT!!!! I have a question: some of my functions from abc.lib.sh are needed in xyz.lib.sh as well as being used by mainscript.sh. The kicker is that if I `source abc.lib.sh` in both of the other files, the function loads twice, which causes an error.
I can do a test before the source command to see if it is already loaded. I just want to know what the common practice is for the sequence of events.
I am currently doing:
1. declare statements
2. source statements
3. functions
4. main code
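The "test before source" idea is usually written as an include guard at the top of the library itself, so every consumer can source it unconditionally. A minimal sketch (file and function names are illustrative):

```shell
#!/bin/bash
# Sketch: an include guard so a library can be sourced from multiple files
# without redefining or re-running anything.
cat > abc.lib.sh <<'EOF'
# Guard: if already loaded, stop sourcing this file immediately.
[ -n "${ABC_LIB_LOADED:-}" ] && return 0
ABC_LIB_LOADED=1
echo "abc.lib.sh loaded"
abc_hello() { echo "hello"; }
EOF
source ./abc.lib.sh   # loads and announces itself
source ./abc.lib.sh   # guard triggers: nothing happens
abc_hello
```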
https://redd.it/18ae4xk
@r_bash
Help needed with my script
So basically I made a simple neofetch clone, and I want to add ASCII art to the left of the text.
I was thinking of creating something like a list of ASCII art that I can switch between.
github repo with code: https://github.com/Kotuu3/shfetch
https://redd.it/18akcqk
@r_bash
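One simple way to get the "logo on the left" layout (a hedged sketch, not how shfetch actually does it; the logo and info content here are made up): keep each piece of ASCII art as a bash array of lines, then print it next to the info with a fixed-width printf column. Switching distros just means swapping which array you copy into `logo`.

```shell
#!/usr/bin/env bash
# Illustrative logo and info; real values would come from the fetch logic.
logo=(
'  .--.  '
' |o_o | '
' |:_/ | '
)
info=(
'os: linux'
'shell: bash'
'uptime: 1h'
)

# Handle columns of uneven length.
rows=${#logo[@]}
if (( ${#info[@]} > rows )); then rows=${#info[@]}; fi

for ((i = 0; i < rows; i++)); do
    # %-10s pads the logo column to a fixed width so the info aligns.
    printf '%-10s %s\n' "${logo[i]:-}" "${info[i]:-}"
done
```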
Why learn bash if there's python?
I don't get it. Python is inherently a simpler language (mainly in terms of syntax, which obviously makes it easier to learn).
Why should I learn bash when I can do everything else in Python?
https://redd.it/18amkcn
@r_bash
Backup database with rsync to remote server, how?
My brain is stuck trying to figure out how to do this with rsync in a script.
This is what I have so far, and it's just very wrong:
docker compose exec -T database pg_dump -U user teslamate | rsync -az 192.168.1.100:/media/user/backup/teslamate/"teslamate.bck.$(date)
any suggestions?
https://redd.it/18aobj0
@r_bash
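The core problem with the one-liner above is that rsync cannot take its source from stdin. Two hedged variants (host, remote path, and db/user names are copied from the post; the filenames and `backup_teslamate` wrapper are illustrative): dump to a local file and rsync it, or stream the dump straight over ssh and skip rsync for the transfer.

```shell
#!/usr/bin/env bash
set -euo pipefail

backup_teslamate() {
    local stamp
    stamp=$(date +%F_%H%M%S)   # a bare $(date) contains spaces, which breaks paths

    # Variant 1: write the dump locally, then let rsync copy the file.
    docker compose exec -T database pg_dump -U user teslamate \
        > "teslamate.bck.$stamp.sql"
    rsync -az "teslamate.bck.$stamp.sql" \
        user@192.168.1.100:/media/user/backup/teslamate/

    # Variant 2: no local copy -- pipe the dump over ssh instead,
    # since rsync cannot read its source from a pipe.
    docker compose exec -T database pg_dump -U user teslamate |
        ssh user@192.168.1.100 \
            "cat > /media/user/backup/teslamate/teslamate.bck.$stamp.sql"
}
```

You would keep only one of the two variants; variant 1 costs local disk space but lets rsync resume, variant 2 avoids the intermediate file entirely.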
Essential Bash Keyboard Shortcuts to Speed Up Your Workflow
https://linuxiac.com/essential-bash-keyboard-shortcuts/
https://redd.it/18av3w9
@r_bash
Bash script with ffmpeg uses 71 GB memory, how do I fix it?
https://preview.redd.it/zyznrwd4oe4c1.png?width=1330&format=png&auto=webp&s=cf9d3e9f41de3a4274b6a7183760f70f632fd74f
* [I am trying to generate a mosaic from 4 input videos using filter complex](https://trac.ffmpeg.org/wiki/Create%20a%20mosaic%20out%20of%20several%20input%20videos)
* The mosaic is generated inside a for loop because start time is different in each iteration on the mosaic
* Then I cut a part of the video as specified by timestamps in array (basically 4 parts) and finally join them
* Then delete the intermediate stuff
* [Here is the bash script that does all this](https://pastebin.com/vB9GMMj4)
* When tested on 640x480 the whole thing works perfectly
* When run on 1920x1080, first 2 iterations work well, 3rd iteration gets a KILLED 9 error meaning I guess it ran out of memory
* I am on Apple M1 16GB Sonoma 14 if that helps
* How can I resolve this?
https://redd.it/18b3yg6
@r_bash
Detecting hot keys and inserting text into stdin
Right now I have a program using a simple loop to get input:

while true
do
    read -ep "" string
    echo "Text Entered: $string"
done

I would like to be able to detect a hotkey combination, such as ctrl+1, and from that, insert text into the currently edited stdin. To be clear, I don't want to edit the input after getting it, but instead add text into the input, for example how some programs automatically add a closing bracket when you enter an opening one.
Is this possible? Where should I look? I have already looked around and can't find anything about inserting text.
https://redd.it/18bfn2z
@r_bash
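Since `read -e` already uses readline, `bind -x` can do this: it runs a shell function that edits the line being typed through the READLINE_LINE and READLINE_POINT variables. A hedged sketch (note: most terminals send nothing distinct for ctrl+1, so a chord that is actually bindable, ctrl-t here, stands in for it):

```shell
#!/usr/bin/env bash
# Insert "[]" at the cursor and leave the cursor between the brackets,
# the way auto-closing editors do.
_insert_brackets() {
    local before=${READLINE_LINE:0:READLINE_POINT}
    local after=${READLINE_LINE:READLINE_POINT}
    READLINE_LINE="${before}[]${after}"
    READLINE_POINT=$((READLINE_POINT + 1))   # cursor between the brackets
}

if [[ $- == *i* ]]; then   # binding only makes sense with line editing active
    bind -x '"\C-t": _insert_brackets'
    while true; do
        read -ep "> " string
        echo "Text Entered: $string"
    done
fi
```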
Help creating an AWK script that compares values?
I need to create an awk script that does the following: find the max value and min value for a specific hour across files that each represent a day. So I have a directory with multiple files like this
day1.csv
day2.csv
day3.csv
day4.csv
day5.csv
day6.csv
day.7.csv
Here is an example of what a few lines in each file would look like
Google,2023-11-30T04:00:00.000Z,38322391063,dev1
Google,2023-11-30T05:00:00.000Z,17091898399,dev1
Google,2023-11-30T06:00:00.000Z,19000746641,dev1
Facebook,2023-11-30T04:00:00.000Z,38322391063,dev1
Facebook,2023-11-30T05:00:00.000Z,17091898399,dev1
Facebook,2023-11-30T06:00:00.000Z,19000746641,dev1
If we take a look at the second field, 2023-11-30T04:00:00.000Z, then 04 = the 4th hour.
So what I am wondering is how to go through all the day files, picking out the lines with a specific hour along with their first field (protocol) and third field (usage). For example, if I wanted to see the max and min value for the 4th hour for Google over the 7 day files, I would have to find each line in each file that has Google as the first field and 04 as the hour in its date field, store its third field (the usage), and then compare all those values to see which is the highest and which is the lowest. If someone could help me understand how to do this with awk I would really appreciate it.
https://redd.it/18bk8h2
@r_bash
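A hedged sketch of the per-protocol, per-hour max/min (the `hour_stats` wrapper and variable names are invented; the timestamp layout comes from the post, where the hour sits at characters 12-13 of field 2):

```shell
#!/usr/bin/env bash
# hour_stats PROTOCOL HOUR FILE...
# Prints the max and min of the usage field (3) over all lines whose
# first field matches PROTOCOL and whose timestamp hour matches HOUR.
hour_stats() {
    local proto=$1 hour=$2
    shift 2
    awk -F, -v p="$proto" -v h="$hour" '
        # $2 looks like 2023-11-30T04:00:00.000Z -> chars 12-13 are the hour
        $1 == p && substr($2, 12, 2) == h {
            if (!seen || $3 + 0 > max) max = $3 + 0
            if (!seen || $3 + 0 < min) min = $3 + 0
            seen = 1
        }
        END { if (seen) printf "max=%.0f min=%.0f\n", max, min }
    ' "$@"
}

# e.g.: hour_stats Google 04 day*.csv
```

The `%.0f` format is used instead of `%d` because usage values like 38322391063 overflow 32-bit integers in some awk implementations.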