Commit
Moved muna in wholesale to avoid confusion. Started updating and increasing friendliness of output.
uriel1998 committed Oct 3, 2024
1 parent 7cadc5b commit b81050a
Showing 25 changed files with 468 additions and 599 deletions.
23 changes: 6 additions & 17 deletions README.md
@@ -1,4 +1,5 @@
# dangerzone!

### (officially *newsbeuter-dangerzone*)

Enhanced, modular bookmarking for newsboat, newsbeuter, or (for that matter)
@@ -136,11 +137,11 @@ and without requiring a great deal of knowledge on the part of the user.
If you create one for another service, please contact me so I can merge it in
(this repository is mirrored multiple places).

### Shorteners

#### murls (shortener)
Some have been removed due to API changes that made them impractical (Facebook), and
some (such as the one for Twitter/X) because of API issues and because I don't give a
shit about fascists.

Murls is a free service and does not require an API key.
### Shorteners

#### YOURLS (shortener)

@@ -152,12 +153,6 @@ Place the URL of your instance and API key into `agaetr.ini`.

### Bookmarking Helpers

#### Twitter via Oysttyer (output)

Posts the title and URL to Twitter using Oysttyer.

Install and set up [oysttyer](https://github.com/oysttyer/oysttyer). Place the
location of the binary into `agaetr.ini` or into your $PATH.

### Shaarli (output)

@@ -227,11 +222,6 @@ Install and set up [youtube-dl](https://youtube-dl.org/) in your $PATH. Without
editing, these scripts save audio/video into `$HOME/Downloads/mp3` and `$HOME/Downloads/videos`
respectively.

#### Facebook

This helper uses `urlencode` (I got it from the `gridsite-clients` package on Debian) on the link, then
calls `sensible-browser` with the generic Facebook sharing link. User interaction *is* required.

## 6. Content Warning

Currently, content warnings are only used with Mastodon. If you do not wish
@@ -296,8 +286,7 @@ future updates can't break.

There are other files in this repository:

* `unredirector.sh` - Used by `agaetr` to remove redirections and shortening.
Exactly the same as [muna](https://github.com/uriel1998/muna).
* `muna.sh` - Exactly the same as [muna](https://github.com/uriel1998/muna). Used to remove redirections and shortening.

* `urlportal.sh` - I use this as my "browser" helper. Originally from [Gotbletu](https://github.com/gotbletu/shownotes/blob/master/urlportal.sh),
I added a few things:
16 changes: 0 additions & 16 deletions agaetr.ini
@@ -7,29 +7,13 @@ GlobalCW = RSS-fed
filters =
#filters = politics blog sex bigot supremacist nazi climate
toot = /home/steven/bin/toot
oysttyer = /home/steven/bin/oysttyer.pl
twython = /home/steven/bin/tweet
shaarli = /home/steven/apps/shaarli-client/bin/shaarli
wallabag = /home/steven/apps/wallabag/wallabag
yourls_api=
yourls_site =
bitly_login =
bitly_api =
wayback_access =
wayback_secret =

[Feed1]
url = /path/to/xml/ideatrash_parsed.xml
sensitive = no
ContentWarning = yes
GlobalCW = blogging

[Feed2]
url = http://rss.upi.com/news/news.rss
sensitive = no
ContentWarning = yes
GlobalCW = news

[CW1]
keyword = discrimination
matches = ableism ageism bigot classism diversity homophobia race racism homosexual gay sexism feminism transphobia fatphobia nazi klan supremacist supremacy slavery holocaust deadname bully
8 changes: 6 additions & 2 deletions bookmark.sh
@@ -4,7 +4,7 @@
#
# This will interactively let you determine where your bookmarks will go for
# newsboat or newsbeuter
# (c) Steven Saus 2020
# (c) Steven Saus 2024
# Licensed under the MIT license
#
##############################################################################
@@ -45,7 +45,7 @@ export SCRIPT_DIR="$HOME/.newsboat"
cd "${SCRIPT_DIR}"
#Deshortening, deobfuscating, and unredirecting the URL

source "$SCRIPT_DIR/unredirector.sh"
source "$SCRIPT_DIR/muna.sh"
unredirector
link="$url"

@@ -97,3 +97,7 @@ for p in $posters;do
fi
done

# TODO: add preview function to each of the modules for fzf --preview giving a quick explanation of what it does
# TODO: modules for each possible browser?
# TODO: Allow calling an editor with multiselect capabilities of fzf?
# TODO: Preview current values (and allow editing of) with fzf selection screen
136 changes: 136 additions & 0 deletions muna.sh
@@ -0,0 +1,136 @@
#!/bin/bash

##############################################################################
# muna, by Steven Saus 3 May 2022
# [email protected]
# Licensed under the Apache License
##############################################################################


export SCRIPT_DIR="$(dirname "$(readlink -f "$0")")"
LOUD=0
if [ "$1" == "--verbose" ];then
shift
LOUD=1
fi

function loud() {
if [ $LOUD -eq 1 ];then
echo "$@"
fi
}

# because this is a bash function, it's using the variable $url as the returned
# variable. So there's no real "return" other than setting that var.
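# Usage sketch when sourced (this is how bookmark.sh uses it); the URL below
# is a placeholder:
#   source "$SCRIPT_DIR/muna.sh"
#   url="https://bit.ly/example"
#   unredirector
#   echo "$url"   # now holds the resolved, unshortened URL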

function unredirector {
#Explainer/reminder - curl will return 404 error codes *unless* you have
# --fail set, in which case you get the error code. That's done here so
# that it handles 404 and missing server exactly the same way, while
# letting the 300 level codes pass through normally.
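# In practice: a live page yields code=200, a redirect yields 301/302/etc.,
# and a dead page or unreachable server leaves $code empty, which is what
# the null check below relies on.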

headers=$(curl -k -s --fail -m 5 --location -sS --head "$url")
code=$(echo "$headers" | head -1 | awk '{print $2}')

#checks for null as well
if [ -z "$code" ];then
if [ $LOUD -eq 1 ];then
loud "[info] Page/server not found, trying Internet Archive"
fi
firsturl="$url"

#In the JSON the Internet Archive returns, the string
# "archived_snapshots": {}
# is returned if it does not exist in the Archive either.
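# When a snapshot does exist, the response looks roughly like this
# (abridged, illustrative only):
#   {"archived_snapshots": {"closest": {"available": true,
#     "url": "http://web.archive.org/web/20240101000000/http://example.com/",
#     "timestamp": "20240101000000", "status": "200"}}}
# The awk chain below extracts that snapshot "url" value.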

api_ia=$(curl -s http://archive.org/wayback/available?url="$url")
NotExists=$(echo "$api_ia" | grep -c -e '"archived_snapshots": {}')
if [ "$NotExists" != "0" ];then
SUCCESS=1 #that is, not a success
if [ $LOUD -eq 1 ];then
loud "[error] Web page is gone and not in Internet Archive!"
loud "[error] For page $firsturl"
fi
unset -v url
unset -v firsturl
else
if [ $LOUD -eq 1 ];then
loud "[info] Fetching Internet Archive version of"
loud "[info] page $firsturl"
fi
url=$(echo "$api_ia" | awk -F 'url": "' '{print $3}' 2>/dev/null | awk -F '", "' '{print $1}' | awk -F '"' '{print $1}')
unset -v firsturl
fi
else
if echo "$code" | grep -q -e "3[0-9][0-9]";then
if [ $LOUD -eq 1 ];then
loud "[info] HTTP $code redirect"
fi
resulturl=""
resulturl=$(wget -T 5 -O- --server-response "$url" 2>&1 | grep "^Location" | tail -1 | awk -F ' ' '{print $2}')
if [ -z "$resulturl" ]; then
if [ $LOUD -eq 1 ];then
loud "[info] No new location found"
fi
resulturl="$url"
else
if [ $LOUD -eq 1 ];then
loud "[info] New location found"
fi
url="$resulturl"
if [ $LOUD -eq 1 ];then
loud "[info] REprocessing $url"
fi
headers=$(curl -k -s -m 5 --location -sS --head "$url")
code=$(echo "$headers" | head -1 | awk '{print $2}')
if echo "$code" | grep -q -e "3[0-9][0-9]";then
if [ $LOUD -eq 1 ];then
loud "[info] Second redirect; passing as-is"
fi
fi
fi
fi
if echo "$code" | grep -q -e "2[0-9][0-9]";then
if [ $LOUD -eq 1 ];then
loud "[info] HTTP $code exists"
fi
fi
fi
}

##############################################################################
# Are we sourced?
# From http://stackoverflow.com/questions/2683279/ddg#34642589
##############################################################################

# Try to execute a `return` statement,
# but do it in a sub-shell and catch the results.
# If this script isn't sourced, that will raise an error.
$(return >/dev/null 2>&1)

# What exit code did that give?
if [ "$?" -eq "0" ];then
loud "[info] Function undirector ready to go."
else
if [ "$#" = 0 ];then
echo "Please call this as a function or with the url as the first argument."
exit 99
else
if [ "$1" != "-q" ];then
# backwards compatibility
url="$1"
LOUD=0
else
url="$2"
LOUD=1
fi
SUCCESS=0
unredirector
if [ $SUCCESS -eq 0 ];then
# If it gets here, it has to be standalone
echo "$url"
else
exit 99
fi
fi
fi
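# Standalone usage sketch (URL is a placeholder):
#   ./muna.sh "https://bit.ly/example"   # prints the resolved URL on stdout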
2 changes: 1 addition & 1 deletion out_avail/add_to_todo.sh
@@ -3,7 +3,7 @@
##############################################################################
#
# sending script
# (c) Steven Saus 2020
# (c) Steven Saus 2024
# Licensed under the MIT license
#
##############################################################################
78 changes: 55 additions & 23 deletions out_avail/email.sh
@@ -3,41 +3,71 @@
##############################################################################
#
# sending script
# (c) Steven Saus 2020
# (c) Steven Saus 2022
# Licensed under the MIT license
#
##############################################################################

# USING CURL credit
#https://blog.edmdesigner.com/send-email-from-linux-command-line/

function loud() {
if [ $LOUD -eq 1 ];then
echo "$@"
fi
}

# should have been passed in, but just in case...

if [ ! -d "${XDG_DATA_HOME}" ];then
export XDG_DATA_HOME="${HOME}/.local/share"
fi
inifile="${XDG_CONFIG_HOME}/agaetr/agaetr.ini"

function email_send {
if [ -z "${1}" ];then
title="Automated email from agaetr: ${link}"
else
title="${1}"
fi
smtp_server=$(grep 'smtp_server =' "${inifile}" | sed 's/ //g' | awk -F '=' '{print $2}')
smtp_port=$(grep 'smtp_port =' "${inifile}" | sed 's/ //g' | awk -F '=' '{print $2}')
smtp_username=$(grep 'smtp_username =' "${inifile}" | sed 's/ //g' | awk -F '=' '{print $2}')
smtp_password=$(grep 'smtp_password =' "${inifile}" | sed 's/ //g' | awk -F '=' '{print $2}')
email_from=$(grep 'email_from =' "${inifile}" | sed 's/ //g' | awk -F '=' '{print $2}')
raw_emails=$(grep 'email_to =' "${inifile}" | sed 's/ //g' | awk -F '=' '{print $2}')

tmpfile=$(mktemp)
echo "Obtaining text of HTML..."
loud "Obtaining text of HTML..."
echo "${link}" > ${tmpfile}
echo " " >> ${tmpfile}
#wget --connect-timeout=2 --read-timeout=10 --tries=1 -e robots=off -O - "${link}" | pandoc --from=html --to=gfm >> ${tmpfile}

wget --connect-timeout=2 --read-timeout=10 --tries=1 -e robots=off -O - "${link}" | sed -e 's/<img[^>]*>//g' | sed -e 's/<div[^>]*>//g' | hxclean | hxnormalize -e -L -s 2>/dev/null | tidy -quiet -omit -clean 2>/dev/null | hxunent | iconv -t utf-8//TRANSLIT - | sed -e 's/\(<em>\|<i>\|<\/em>\|<\/i>\)/&🞵/g' | sed -e 's/\(<strong>\|<b>\|<\/strong>\|<\/b>\)/&🞶/g' |lynx -dump -stdin -display_charset UTF-8 -width 140 | sed -e 's/\*/•/g' | sed -e 's/Θ/🞵/g' | sed -e 's/Φ/🞯/g' >> ${tmpfile}
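# The pipeline above fetches the page, strips <img>/<div> markup, cleans and
# normalizes the HTML (hxclean/hxnormalize/tidy/hxunent), marks emphasis with
# placeholder glyphs, renders the result to plain text with lynx, and then
# swaps in bullet/emphasis characters for the dump.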

# This is from https://github.com/uriel1998/ppl_virdirsyncer_addysearch
addressbook=$(which pplsearch)
if [ -f "$addressbook" ];then
email=$(eval "$addressbook" -m)
fi
if [ -z ${email} ];then
email=$(echo "root@localhost")
fi

binary=$(grep 'mutt =' "$HOME/.config/agaetr/agaetr.ini" | sed 's/ //g' | awk -F '=' '{print $2}')
if [ ! -f "$binary" ];then
binary=$(which mutt)
fi
if [ -f "$binary" ];then
echo "Sending email..."
echo ${tmpfile} | mutt -s "${title}" "${email}"
else
echo "Mutt not found in order to send email!"
fi
# Removed addressbook bit since that doesn't make sense here.

# Split the semicolon-separated email_to list into individual email addresses
OIFS="$IFS"
IFS=';' read -ra email_addresses <<< "${raw_emails}"
IFS="$OIFS"
curl_bin=$(which curl)
for email_addy in "${email_addresses[@]}"
do
# assemble the header
loud "Assembling the header"
tmpfile2=$(mktemp)
echo "To: ${email_addy}" > "${tmpfile2}"
echo "Subject: ${title}" >> "${tmpfile2}"
echo "From: ${email_from}" >> "${tmpfile2}"
echo -e "\n\n" >> "${tmpfile2}"
cat "${tmpfile}" >> "${tmpfile2}"
# assemble the command
loud "Assembling the command for $email_addy."
command_line=$(printf "%s --url \'smtps://%s:%s\' --ssl-reqd --mail-from \'%s\' --mail-rcpt \'%s\' --upload-file %s --user \'%s:%s\'" \
"${curl_bin}" "${smtp_server}" "${smtp_port}" "${email_from}" "${email_addy}" "${tmpfile2}" "${smtp_username}" "${smtp_password}")
eval "${command_line}"
rm "${tmpfile2}"
done
rm ${tmpfile}
}

@@ -64,8 +94,10 @@ else
link="${1}"
if [ ! -z "$2" ];then
title="$2"
else
title="${1}"
fi
email_send
email_send "${title}"
fi
fi

2 changes: 1 addition & 1 deletion out_avail/facebook.sh
@@ -3,7 +3,7 @@
##############################################################################
#
# sending script
# (c) Steven Saus 2020
# (c) Steven Saus 2024
# Licensed under the MIT license
#
# REQUIRES URLENCODE which is in package gridsite-clients on Debian
2 changes: 1 addition & 1 deletion out_avail/jpg_capture.sh
@@ -3,7 +3,7 @@
##############################################################################
#
# sending script
# (c) Steven Saus 2020
# (c) Steven Saus 2024
# Licensed under the MIT license
#
##############################################################################
