lost+skunk 2024-07-13 21:32:04 +03:00
parent 8391cc34a9
commit 667de65e2f
18 changed files with 869 additions and 624 deletions

LICENSE (new file, +13 lines)

@ -0,0 +1,13 @@
X11 License
Copyright (C) 1996 X Consortium
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE X CONSORTIUM BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Except as contained in this notice, the name of the X Consortium shall not be used in advertising or otherwise to promote the sale, use or other dealings in this Software without prior written authorization from the X Consortium.
X Window System is a trademark of X Consortium, Inc.

README.md (new file, +76 lines)

@ -0,0 +1,76 @@
[![Matrix room](https://img.shields.io/badge/matrix-000000?style=for-the-badge&logo=Matrix&logoColor=white)](https://go.kde.org/matrix/#/#skunkyart:ebloid.ru)
# Instances
|Instance|Yggdrasil|I2P|Tor|NSFW|Proxying|Country|
|:-----:|:-------:|:-:|:-:|:--:|:--------:|:-----:|
|[skunky.ebloid.ru](https://skunky.ebloid.ru/art)|[Yes](http://[201:eba5:d1fc:bf7b:cfcb:a811:4b8b:7ea3]/art)|No|No| No | No | Russia |
|[clovius.club](https://skunky.clovius.club)|No|No|No| Yes | Yes | Sweden |
|[bloat.cat](https://skunky.bloat.cat)|No|No|No| Yes | Yes | Romania |
|[frontendfriendly.xyz](https://skunkyart.frontendfriendly.xyz)|No|No|No| Yes | Yes | Finland |
# EN 🇺🇸
## Description
SkunkyArt 🦨 -- an alternative frontend for DeviantArt that works smoothly even on fairly old hardware, since it uses no JavaScript.
## Config
The sample config is in the `config.example.json` file; a minimal sketch is also shown after the option list below. To specify your own config path, use the `-c` or `--config` CLI argument.
* `listen` -- the address and port on which SkunkyArt will listen
* `base-path` -- the path prefix of the instance. Example: `"base-path": "/art/"` -> https://skunky.ebloid.ru/art/
* `cache` -- caching system; disabled by default.
  * `path` -- path to the cache directory
  * `lifetime` -- cache entry lifetime, in milliseconds
  * `max-size` -- maximum cached file size, in bytes
* `dirs-to-memory` -- determines which directories are copied into RAM when SkunkyArt starts. Required.
* `download-proxy` -- proxy address for downloading files.
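
A minimal `config.json` sketch assembled from the options above. All values here (listen address, paths, limits, proxy URL) are illustrative assumptions, and the nested cache keys simply follow the option names listed above; `config.example.json` remains the authoritative reference.

```json
{
  "listen": "127.0.0.1:3003",
  "base-path": "/art/",
  "cache": {
    "enabled": true,
    "path": "cache",
    "lifetime": 86400000,
    "max-size": 10485760
  },
  "dirs-to-memory": ["html", "css"],
  "download-proxy": "socks5://127.0.0.1:9050"
}
```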
## Examples of reverse proxies
Nginx:
```nginx
server {
listen 443 ssl;
server_name skunky.example.com;
location ((BASE URL)) { # if you have a separate subdomain for the frontend, insert '/' instead of '((BASE URL))'.
proxy_set_header Scheme $scheme;
proxy_http_version 1.1;
proxy_pass http://((IP)):((PORT));
}
}
```
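The `proxy_set_header Scheme $scheme;` line is worth keeping: the router reads the `Scheme` request header and builds absolute links with `https://` when it is set to `https`, falling back to `http://` otherwise.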
## How do I add my instance to the list?
To do this, either open a PR adding your instance to the `instances.json` file, or report it in the Matrix room. The file format should be self-explanatory (a sketch of an entry is shown below the rules). However, be aware that this list has a couple of rules:
1. The instance must not use Cloudflare.
2. If your instance runs modified source code, you must publish it on a free (libre) platform; GitHub and GitLab, for example, do not qualify.
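
A sketch of a single `instances.json` entry, inferred from the fields the frontend reads (a title, a country, a list of URL sets, and per-instance settings). The exact key names and nesting are assumptions; check the existing entries in the file before copying this.

```json
{
  "title": "skunky.example.org",
  "country": "Germany",
  "urls": [
    {
      "clearnet": "https://skunky.example.org/",
      "ygg": "",
      "i2p": "",
      "tor": ""
    }
  ],
  "settings": {
    "nsfw": true,
    "proxy": false
  }
}
```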
## Acknowledgements
* [Лис⚛](https://go.kde.org/matrix/#/@fox:matrix.org) -- helped me understand Go and gave me a lot of useful advice on this language.
# RU 🇷🇺
## Описание
SkunkyArt 🦨 -- альтернативный фронтенд к DeviantArt, который будет работать без проблем даже на довольно старом оборудовании, за счёт отсутствия JavaScript.
## Конфиг
Пример конфига находится в файле `config.example.json`. Чтобы указать свой путь до конфига, используйте CLI-аргумент `-c` или `--config`.
* `listen` -- адрес и порт, на котором будет слушать SkunkyArt
* `base-path` -- путь к инстансу. Пример: "base-path": "/art/" -> https://skunky.ebloid.ru/art/
* `cache` -- система кеширования; по умолчанию - выкл.
  * `path` -- путь до кеша
  * `lifetime` -- время жизни файла в кеше; измеряется в Unix-миллисекундах
  * `max-size` -- максимальный размер файла в байтах
* `dirs-to-memory` -- данная настройка определяет какие каталоги будут скопированы в ОЗУ при запуске SkunkyArt. Обязательна
* `download-proxy` -- адрес прокси для загрузки файлов
## Примеры reverse-прокси
Nginx:
```nginx
server {
listen 443 ssl;
server_name skunky.example.com;
location ((BASE URL)) { # если у вас отдельный поддомен для фронтенда, вместо '((BASE URL))' вставляйте '/'
proxy_set_header Scheme $scheme;
proxy_http_version 1.1;
proxy_pass http://((IP)):((PORT));
}
}
```
## Как добавить свой инстанс в список?
Чтобы это сделать, вы должны либо сделать PR, добавив в файл `instances.json` свой инстанс, либо сообщить о нём в комнате в Matrix. Думаю, он не нуждается в описании. Однако учтите, у этого списка есть пара правил:
1. Инстанс не должен использовать Cloudflare.
2. Если ваш инстанс имеет модифицированный исходный код, то вам нужно опубликовать его на любую свободную площадку. Например, Github и Gitlab таковыми не являются.
## Благодарности
* [Лис⚛](https://go.kde.org/matrix/#/@fox:matrix.org) -- помог разобраться в Go и много чего полезного посоветовал по этому языку.

TODO.md (new file, +7 lines)

@ -0,0 +1,7 @@
# v1.3.x
* Finish the description parser
* Implement thumbnails and optimize the CSS for small screens
# v1.4
* Implement themes
* Implement a multilingual interface
* Implement an API

View File

@ -2,8 +2,8 @@ package app
import (
"encoding/json"
"errors"
"os"
"time"
)
type cache_config struct {
@ -15,13 +15,13 @@ type cache_config struct {
}
type config struct {
cfg string
Listen string
BasePath string `json:"base-path"`
Cache cache_config
Proxy, Nsfw bool
WixmpProxy string `json:"wixmp-proxy"`
TemplatesDir string `json:"templates-dir"`
cfg string
Listen string
BasePath string `json:"base-path"`
Cache cache_config
Proxy, Nsfw bool
DownloadProxy string `json:"download-proxy"`
Dirs []string `json:"dirs-to-memory"`
}
var CFG = config{
@ -33,51 +33,52 @@ var CFG = config{
Path: "cache",
UpdateInterval: 1,
},
TemplatesDir: "html",
Proxy: true,
Nsfw: true,
Dirs: []string{"html", "css"},
Proxy: true,
Nsfw: true,
}
func ExecuteConfig() {
try := func(err error, exitcode int) {
if err != nil {
println(err.Error())
os.Exit(exitcode)
go func() {
for {
Templates["instances.json"] = string(Download("https://git.macaw.me/skunky/SkunkyArt/raw/branch/master/instances.json").Body)
time.Sleep(1 * time.Hour)
}
}
}()
a := os.Args
if l := len(a); l > 1 {
switch a[1] {
case "-c", "--config":
if l >= 3 {
CFG.cfg = a[2]
} else {
try(errors.New("Not enought arguments"), 1)
}
case "-h", "--help":
try(errors.New(`SkunkyArt v1.3 [refactoring]
const helpmsg = `SkunkyArt v1.3 [refactoring]
Usage:
- [-c|--config] - path to config
- [-h|--help] - returns this message
Example:
./skunkyart -c config.json
Copyright lost+skunk, X11. https://git.macaw.me/skunky/skunkyart/src/tag/v1.3`), 0)
default:
try(errors.New("Unreconginzed argument: "+a[1]), 1)
Copyright lost+skunk, X11. https://git.macaw.me/skunky/skunkyart/src/tag/v1.3`
a := os.Args
for n, x := range a {
switch x {
case "-c", "--config":
if len(a) >= 3 {
CFG.cfg = a[n+1]
} else {
exit("Not enought arguments", 1)
}
case "-h", "--help":
exit(helpmsg, 0)
}
if CFG.cfg != "" {
f, err := os.ReadFile(CFG.cfg)
try(err, 1)
}
try(json.Unmarshal(f, &CFG), 1)
if CFG.Cache.Enabled && !CFG.Proxy {
try(errors.New("Incompatible settings detected: cannot use caching media content without proxy"), 1)
}
if CFG.cfg != "" {
f, err := os.ReadFile(CFG.cfg)
try_with_exitstatus(err, 1)
if CFG.Cache.MaxSize != 0 || CFG.Cache.Lifetime != 0 {
go InitCacheSystem()
}
try_with_exitstatus(json.Unmarshal(f, &CFG), 1)
if CFG.Cache.Enabled && !CFG.Proxy {
exit("Incompatible settings detected: cannot use caching media content without proxy", 1)
}
if CFG.Cache.MaxSize != 0 || CFG.Cache.Lifetime != 0 {
go InitCacheSystem()
}
}
}

app/parsers.go (new file, +367 lines)

@ -0,0 +1,367 @@
package app
import (
"encoding/json"
"strconv"
"strings"
"git.macaw.me/skunky/devianter"
"golang.org/x/net/html"
)
func (s skunkyart) ParseComments(c devianter.Comments) string {
var cmmts strings.Builder
replied := make(map[int]string)
cmmts.WriteString("<details><summary>Comments: <b>")
cmmts.WriteString(strconv.Itoa(c.Total))
cmmts.WriteString("</b></summary>")
for _, x := range c.Thread {
replied[x.ID] = x.User.Username
cmmts.WriteString(`<div class="msg`)
if x.Parent > 0 {
cmmts.WriteString(` reply`)
}
cmmts.WriteString(`"><p id="`)
cmmts.WriteString(strconv.Itoa(x.ID))
cmmts.WriteString(`"><img src="`)
cmmts.WriteString(UrlBuilder("media", "emojitar", x.User.Username, "?type=a"))
cmmts.WriteString(`" width="30px" height="30px"><a href="`)
cmmts.WriteString(UrlBuilder("group_user", "?q=", x.User.Username, "&type=a"))
cmmts.WriteString(`"><b`)
cmmts.WriteString(` class="`)
if x.User.Banned {
cmmts.WriteString(`banned`)
}
if x.Author {
cmmts.WriteString(`author`)
}
cmmts.WriteString(`">`)
cmmts.WriteString(x.User.Username)
cmmts.WriteString("</b></a> ")
if x.Parent > 0 {
cmmts.WriteString(` In reply to <a href="#`)
cmmts.WriteString(strconv.Itoa(x.Parent))
cmmts.WriteString(`">`)
if replied[x.Parent] == "" {
cmmts.WriteString("???")
} else {
cmmts.WriteString(replied[x.Parent])
}
cmmts.WriteString("</a>")
}
cmmts.WriteString(" [")
cmmts.WriteString(x.Posted.UTC().String())
cmmts.WriteString("]<p>")
cmmts.WriteString(ParseDescription(x.TextContent))
cmmts.WriteString("<p>👍: ")
cmmts.WriteString(strconv.Itoa(x.Likes))
cmmts.WriteString(" ⏩: ")
cmmts.WriteString(strconv.Itoa(x.Replies))
cmmts.WriteString("</p></div>\n")
}
cmmts.WriteString(s.NavBase(DeviationList{
Pages: 0,
More: c.HasMore,
}))
cmmts.WriteString("</details>")
return cmmts.String()
}
func (s skunkyart) DeviationList(devs []devianter.Deviation, content ...DeviationList) string {
var list strings.Builder
if s.Atom && s.Page > 1 {
s.ReturnHTTPError(400)
return ""
} else if s.Atom {
list.WriteString(`<?xml version="1.0" encoding="UTF-8"?><feed xmlns:media="http://search.yahoo.com/mrss/" xmlns="http://www.w3.org/2005/Atom">`)
list.WriteString(`<title>`)
if s.Type == 0 {
list.WriteString("Daily Deviations")
} else if len(devs) != 0 {
list.WriteString(devs[0].Author.Username)
} else {
list.WriteString("SkunkyArt")
}
list.WriteString(`</title>`)
list.WriteString(`<link rel="alternate" href="`)
list.WriteString(Host)
list.WriteString(`"/>`)
} else {
list.WriteString(`<div class="content">`)
}
for _, data := range devs {
if !(data.NSFW && !CFG.Nsfw) {
url := ParseMedia(data.Media)
if s.Atom {
id := strconv.Itoa(data.ID)
list.WriteString(`<entry><author><name>`)
list.WriteString(data.Author.Username)
list.WriteString(`</name></author><title>`)
list.WriteString(data.Title)
list.WriteString(`</title><link rel="alternate" type="text/html" href="`)
list.WriteString(UrlBuilder("post", data.Author.Username, "atom-"+id))
list.WriteString(`"/><id>`)
list.WriteString(id)
list.WriteString(`</id><published>`)
list.WriteString(data.PublishedTime.UTC().Format("Mon, 02 Jan 2006 15:04:05 -0700"))
list.WriteString(`</published>`)
list.WriteString(`<media:group><media:title>`)
list.WriteString(data.Title)
list.WriteString(`</media:title><media:thumbnail url="`)
list.WriteString(url)
list.WriteString(`"/></media:group><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><a href="`)
list.WriteString(ConvertDeviantArtUrlToSkunkyArt(data.Url))
list.WriteString(`"><img src="`)
list.WriteString(url)
list.WriteString(`"/></a><p>`)
list.WriteString(ParseDescription(data.TextContent))
list.WriteString(`</p></div></content></entry>`)
} else {
list.WriteString(`<div class="block">`)
if url != "" {
list.WriteString(`<a title="open/download" href="`)
list.WriteString(url)
list.WriteString(`"><img loading="lazy" src="`)
list.WriteString(url)
list.WriteString(`" width="15%"></a>`)
} else {
list.WriteString(`<h1>[ TEXT ]</h1>`)
}
list.WriteString(`<br><a href="`)
list.WriteString(ConvertDeviantArtUrlToSkunkyArt(data.Url))
list.WriteString(`">`)
list.WriteString(data.Author.Username)
list.WriteString(" - ")
list.WriteString(data.Title)
// NSFW, AI, and Daily Deviation badges
if data.NSFW {
list.WriteString(` [<span class="nsfw">NSFW</span>]`)
}
if data.AI {
list.WriteString(" [🤖]")
}
if data.DD {
list.WriteString(` [<span class="dd">DD</span>]`)
}
list.WriteString("</a></div>")
}
}
}
if s.Atom {
list.WriteString("</feed>")
s.Writer.Write([]byte(list.String()))
return ""
} else {
list.WriteString("</div>")
if content != nil {
list.WriteString(s.NavBase(content[0]))
}
}
return list.String()
}
/* DESCRIPTION/COMMENT PARSER */
type text struct {
TXT string
TXT_RAW string
From int
To int
}
func ParseDescription(dscr devianter.Text) string {
var parseddescription strings.Builder
TagBuilder := func(content string, tags ...string) string {
l := len(tags)
for x := 0; x < l; x++ {
var htm strings.Builder
htm.WriteString("<")
htm.WriteString(tags[x])
htm.WriteString(">")
htm.WriteString(content)
htm.WriteString("</")
htm.WriteString(tags[x])
htm.WriteString(">")
content = htm.String()
}
return content
}
DeleteTrackingFromUrl := func(url string) string {
if len(url) > 42 && url[:42] == "https://www.deviantart.com/users/outgoing?" {
url = url[42:]
}
return url
}
if description, dl := dscr.Html.Markup, len(dscr.Html.Markup); dl != 0 &&
description[0] == '{' &&
description[dl-1] == '}' {
var descr struct {
Blocks []struct {
Text, Type string
InlineStyleRanges []struct {
Offset, Length int
Style string
}
EntityRanges []struct {
Offset, Length int
Key int
}
Data struct {
TextAlignment string
}
}
EntityMap map[string]struct {
Type string
Data struct {
Url string
Config struct {
Aligment string
Width int
}
Data devianter.Deviation
}
}
}
e := json.Unmarshal([]byte(description), &descr)
try(e)
entities := make(map[int]devianter.Deviation)
urls := make(map[int]string)
for n, x := range descr.EntityMap {
num, _ := strconv.Atoi(n)
if x.Data.Url != "" {
urls[num] = DeleteTrackingFromUrl(x.Data.Url)
}
entities[num] = x.Data.Data
}
for _, x := range descr.Blocks {
Styles := make([]text, len(x.InlineStyleRanges))
if len(x.InlineStyleRanges) != 0 {
var tags = make(map[int][]string)
for n, rngs := range x.InlineStyleRanges {
Styles := &Styles[n]
switch rngs.Style {
case "BOLD":
rngs.Style = "b"
case "UNDERLINE":
rngs.Style = "u"
case "ITALIC":
rngs.Style = "i"
}
Styles.From = rngs.Offset
Styles.To = rngs.Offset + rngs.Length
FT := Styles.From * Styles.To
tags[FT] = append(tags[FT], rngs.Style)
}
for n := 0; n < len(Styles); n++ {
Styles := &Styles[n]
Styles.TXT_RAW = x.Text[Styles.From:Styles.To]
Styles.TXT = TagBuilder(Styles.TXT_RAW, tags[Styles.From*Styles.To]...)
}
}
switch x.Type {
case "atomic":
d := entities[x.EntityRanges[0].Key]
parseddescription.WriteString(`<a href="`)
parseddescription.WriteString(ConvertDeviantArtUrlToSkunkyArt(d.Url))
parseddescription.WriteString(`"><img width="50%" src="`)
parseddescription.WriteString(ParseMedia(d.Media))
parseddescription.WriteString(`" title="`)
parseddescription.WriteString(d.Author.Username)
parseddescription.WriteString(" - ")
parseddescription.WriteString(d.Title)
parseddescription.WriteString(`"></a>`)
case "unstyled":
if l := len(Styles); l != 0 {
for n, r := range Styles {
var tag string
if x.Type == "header-two" {
tag = "h2"
}
parseddescription.WriteString(x.Text[:r.From])
if len(urls) != 0 && len(x.EntityRanges) != 0 {
ra := &x.EntityRanges[0]
parseddescription.WriteString(`<a target="_blank" href="`)
parseddescription.WriteString(urls[ra.Key])
parseddescription.WriteString(`">`)
parseddescription.WriteString(r.TXT)
parseddescription.WriteString(`</a>`)
} else if l > n+1 {
parseddescription.WriteString(r.TXT)
}
parseddescription.WriteString(TagBuilder(tag, x.Text[r.To:]))
}
} else {
parseddescription.WriteString(x.Text)
}
}
parseddescription.WriteString("<br>")
}
} else if dl != 0 {
for tt := html.NewTokenizer(strings.NewReader(dscr.Html.Markup)); ; {
switch tt.Next() {
case html.ErrorToken:
return parseddescription.String()
case html.StartTagToken, html.EndTagToken, html.SelfClosingTagToken:
token := tt.Token()
switch token.Data {
case "a":
for _, a := range token.Attr {
if a.Key == "href" {
url := DeleteTrackingFromUrl(a.Val)
parseddescription.WriteString(`<a target="_blank" href="`)
parseddescription.WriteString(url)
parseddescription.WriteString(`">`)
parseddescription.WriteString(GetValueOfTag(tt))
parseddescription.WriteString("</a> ")
}
}
case "img":
var uri, title string
for b, a := range token.Attr {
switch a.Key {
case "src":
if len(a.Val) > 9 && a.Val[8:9] == "e" {
uri = UrlBuilder("media", "emojitar", a.Val[37:len(a.Val)-4], "?type=e")
}
case "title":
title = a.Val
}
if title != "" {
for x := -1; x < b; x++ {
parseddescription.WriteString(`<img src="`)
parseddescription.WriteString(uri)
parseddescription.WriteString(`" title="`)
parseddescription.WriteString(title)
parseddescription.WriteString(`">`)
}
}
}
case "br", "li", "ul", "p", "b":
parseddescription.WriteString(token.String())
case "div":
parseddescription.WriteString("<p> ")
}
case html.TextToken:
parseddescription.Write(tt.Text())
}
}
}
return parseddescription.String()
}

View File

@ -4,11 +4,12 @@ import (
"io"
"net/http"
u "net/url"
"os"
"strconv"
"strings"
)
var Host string
func Router() {
parsepath := func(path string) map[int]string {
if l := len(CFG.BasePath); len(path) > l {
@ -43,18 +44,18 @@ func Router() {
// the function that drives everything
handle := func(w http.ResponseWriter, r *http.Request) {
path := parsepath(r.URL.Path)
var wr = io.WriteString
open_n_send := func(name string) {
f, e := os.ReadFile(name)
err(e)
wr(w, string(f))
if h := r.Header["Scheme"]; len(h) != 0 && h[0] == "https" {
Host = h[0] + "://" + r.Host
} else {
Host = "http://" + r.Host
}
path := parsepath(r.URL.Path)
// struct with the handler methods
var skunky skunkyart
skunky.Args = r.URL.Query()
skunky.Writer = w
skunky.Args = r.URL.Query()
skunky.BasePath = CFG.BasePath
arg := skunky.Args.Get
@ -95,18 +96,12 @@ func Router() {
}
case "about":
skunky.About()
case "gui":
case "stylesheet":
w.Header().Add("content-type", "text/css")
open_n_send(next(path, 2))
io.WriteString(w, Templates["css/skunky.css"])
}
}
http.HandleFunc("/", handle)
http.ListenAndServe(CFG.Listen, nil)
}
func err(e error) {
if e != nil {
println(e.Error())
}
}

View File

@ -2,7 +2,6 @@ package app
import (
"encoding/base64"
"encoding/json"
"io"
"net/http"
u "net/url"
@ -17,19 +16,36 @@ import (
"golang.org/x/net/html"
)
// template parsing
/* INTERNAL */
func exit(msg string, code int) {
println(msg)
os.Exit(code)
}
func try(e error) {
if e != nil {
println(e.Error())
}
}
func try_with_exitstatus(err error, code int) {
if err != nil {
exit(err.Error(), code)
}
}
// some crap for frontend
func (s skunkyart) ExecuteTemplate(file string, data any) {
var buf strings.Builder
tmp := template.New(file)
tmp, e := tmp.Parse(Templates[file])
err(e)
err(tmp.Execute(&buf, &data))
try(e)
try(tmp.Execute(&buf, &data))
wr(s.Writer, buf.String())
}
func UrlBuilder(strs ...string) string {
var str strings.Builder
l := len(strs)
str.WriteString(Host)
str.WriteString(CFG.BasePath)
for n, x := range strs {
str.WriteString(x)
@ -45,7 +61,7 @@ func (s skunkyart) ReturnHTTPError(status int) {
var msg strings.Builder
msg.WriteString(`<html><link rel="stylesheet" href="`)
msg.WriteString(UrlBuilder("gui", "css", "skunky.css"))
msg.WriteString(UrlBuilder("stylesheet"))
msg.WriteString(`" /><h1>`)
msg.WriteString(strconv.Itoa(status))
msg.WriteString(" - ")
@ -55,7 +71,131 @@ func (s skunkyart) ReturnHTTPError(status int) {
wr(s.Writer, msg.String())
}
func (s skunkyart) ConvertDeviantArtUrlToSkunkyArt(url string) (output string) {
type Downloaded struct {
Headers http.Header
Status int
Body []byte
}
func Download(url string) (d Downloaded) {
cli := &http.Client{}
if CFG.DownloadProxy != "" {
u, e := u.Parse(CFG.DownloadProxy)
try(e)
cli.Transport = &http.Transport{Proxy: http.ProxyURL(u)}
}
req, e := http.NewRequest("GET", url, nil)
try(e)
req.Header.Set("User-Agent", "Mozilla/5.0 (X11; Linux x86_64; rv:123.0) Gecko/20100101 Firefox/123.0.0")
resp, e := cli.Do(req)
try(e)
defer resp.Body.Close()
b, e := io.ReadAll(resp.Body)
try(e)
d.Body = b
d.Status = resp.StatusCode
d.Headers = resp.Header
return
}
// caching
func (s skunkyart) DownloadAndSendMedia(subdomain, path string) {
var url strings.Builder
url.WriteString("https://images-wixmp-")
url.WriteString(subdomain)
url.WriteString(".wixmp.com/")
url.WriteString(path)
url.WriteString("?token=")
url.WriteString(s.Args.Get("token"))
if CFG.Cache.Enabled {
os.Mkdir(CFG.Cache.Path, 0700)
fname := CFG.Cache.Path + "/" + base64.StdEncoding.EncodeToString([]byte(subdomain+path))
file, e := os.Open(fname)
if e != nil {
dwnld := Download(url.String())
if dwnld.Status == 200 && dwnld.Headers["Content-Type"][0][:5] == "image" {
try(os.WriteFile(fname, dwnld.Body, 0700))
s.Writer.Write(dwnld.Body)
}
} else {
file, e := io.ReadAll(file)
try(e)
s.Writer.Write(file)
}
} else if CFG.Proxy {
dwnld := Download(url.String())
s.Writer.Write(dwnld.Body)
} else {
s.Writer.WriteHeader(403)
s.Writer.Write([]byte("Sorry, butt proxy on this instance are disabled."))
}
}
func InitCacheSystem() {
c := &CFG.Cache
for {
dir, e := os.Open(c.Path)
try(e)
stat, e := dir.Stat()
try(e)
dirnames, e := dir.Readdirnames(-1)
try(e)
for _, a := range dirnames {
a = c.Path + "/" + a
if c.Lifetime != 0 {
now := time.Now().UnixMilli()
f, _ := os.Stat(a)
stat := f.Sys().(*syscall.Stat_t)
time := time.Unix(stat.Ctim.Unix()).UnixMilli()
if time+c.Lifetime <= now {
try(os.RemoveAll(a))
}
}
if c.MaxSize != 0 && stat.Size() > c.MaxSize {
try(os.RemoveAll(a))
}
}
dir.Close()
time.Sleep(time.Second * time.Duration(CFG.Cache.UpdateInterval))
}
}
func CopyTemplatesToMemory() {
for _, dirname := range CFG.Dirs {
dir, e := os.ReadDir(dirname)
try_with_exitstatus(e, 1)
for _, x := range dir {
n := dirname + "/" + x.Name()
file, e := os.ReadFile(n)
try_with_exitstatus(e, 1)
Templates[n] = string(file)
}
}
}
/* PARSING HELPERS */
func ParseMedia(media devianter.Media) string {
url := devianter.UrlFromMedia(media)
if len(url) != 0 && CFG.Proxy {
url = url[21:]
dot := strings.Index(url, ".")
return UrlBuilder("media", "file", url[:dot], url[dot+11:])
}
return url
}
func ConvertDeviantArtUrlToSkunkyArt(url string) (output string) {
if len(url) > 32 && url[27:32] != "stash" {
url = url[27:]
toart := strings.Index(url, "/art/")
@ -78,13 +218,7 @@ func BuildUserPlate(name string) string {
return htm.String()
}
type text struct {
TXT string
from int
to int
}
func tagval(t *html.Tokenizer) string {
func GetValueOfTag(t *html.Tokenizer) string {
for tt := t.Next(); ; {
switch tt {
default:
@ -95,198 +229,14 @@ func tagval(t *html.Tokenizer) string {
}
}
func ParseDescription(dscr devianter.Text) string {
var parseddescription strings.Builder
TagBuilder := func(tag string, content string) string {
if tag != "" {
var htm strings.Builder
htm.WriteString("<")
htm.WriteString(tag)
htm.WriteString(">")
htm.WriteString(content)
htm.WriteString("</")
htm.WriteString(tag)
htm.WriteString(">")
return htm.String()
}
return content
}
DeleteSpywareFromUrl := func(url string) string {
if len(url) > 42 && url[:42] == "https://www.deviantart.com/users/outgoing?" {
url = url[42:]
}
return url
}
if description, dl := dscr.Html.Markup, len(dscr.Html.Markup); dl != 0 &&
description[0] == '{' &&
description[dl-1] == '}' {
var descr struct {
Blocks []struct {
Text, Type string
InlineStyleRanges []struct {
Offset, Length int
Style string
}
EntityRanges []struct {
Offset, Length int
Key int
}
Data struct {
TextAlignment string
}
}
EntityMap map[string]struct {
Type string
Data struct {
Url string
Config struct {
Aligment string
Width int
}
Data devianter.Deviation
}
}
}
e := json.Unmarshal([]byte(description), &descr)
err(e)
entities := make(map[int]devianter.Deviation)
urls := make(map[int]string)
for n, x := range descr.EntityMap {
num, _ := strconv.Atoi(n)
if x.Data.Url != "" {
urls[num] = DeleteSpywareFromUrl(x.Data.Url)
}
entities[num] = x.Data.Data
}
for _, x := range descr.Blocks {
ranges := make(map[int]text)
for i, rngs := range x.InlineStyleRanges {
var tag string
switch rngs.Style {
case "BOLD":
tag = "b"
case "UNDERLINE":
tag = "u"
case "ITALIC":
tag = "i"
}
fromto := rngs.Offset + rngs.Length
ranges[i] = text{
TXT: TagBuilder(tag, x.Text[rngs.Offset:fromto]),
from: rngs.Offset,
to: fromto,
}
}
switch x.Type {
case "atomic":
d := entities[x.EntityRanges[0].Key]
parseddescription.WriteString(`<img width="50%" src="`)
parseddescription.WriteString(ParseMedia(d.Media))
parseddescription.WriteString(`" title="`)
parseddescription.WriteString(d.Author.Username)
parseddescription.WriteString(" - ")
parseddescription.WriteString(d.Title)
parseddescription.WriteString(`">`)
case "unstyled":
if len(ranges) != 0 {
for _, r := range ranges {
var tag string
switch x.Type {
case "header-two":
tag = "h2"
}
parseddescription.WriteString(x.Text[:r.from])
if len(urls) != 0 && len(x.EntityRanges) != 0 {
ra := &x.EntityRanges[0]
parseddescription.WriteString(`<a target="_blank" href="`)
parseddescription.WriteString(urls[ra.Key])
parseddescription.WriteString(`">`)
parseddescription.WriteString(r.TXT)
parseddescription.WriteString(`</a>`)
} else {
parseddescription.WriteString(r.TXT)
}
parseddescription.WriteString(TagBuilder(tag, x.Text[r.to:]))
}
} else {
parseddescription.WriteString(x.Text)
}
}
parseddescription.WriteString("<br>")
}
} else if dl != 0 {
for tt := html.NewTokenizer(strings.NewReader(dscr.Html.Markup)); ; {
switch tt.Next() {
case html.ErrorToken:
return parseddescription.String()
case html.StartTagToken, html.EndTagToken, html.SelfClosingTagToken:
token := tt.Token()
switch token.Data {
case "a":
for _, a := range token.Attr {
if a.Key == "href" {
url := DeleteSpywareFromUrl(a.Val)
parseddescription.WriteString(`<a target="_blank" href="`)
parseddescription.WriteString(url)
parseddescription.WriteString(`">`)
parseddescription.WriteString(tagval(tt))
parseddescription.WriteString("</a> ")
}
}
case "img":
var uri, title string
for b, a := range token.Attr {
switch a.Key {
case "src":
if len(a.Val) > 9 && a.Val[8:9] == "e" {
uri = UrlBuilder("media", "emojitar", a.Val[37:len(a.Val)-4], "?type=e")
}
case "title":
title = a.Val
}
if title != "" {
for x := -1; x < b; x++ {
parseddescription.WriteString(`<img src="`)
parseddescription.WriteString(uri)
parseddescription.WriteString(`" title="`)
parseddescription.WriteString(title)
parseddescription.WriteString(`">`)
}
}
}
case "br", "li", "ul", "p", "b":
parseddescription.WriteString(token.String())
case "div":
parseddescription.WriteString("<p> ")
}
case html.TextToken:
parseddescription.Write(tt.Text())
}
}
}
return parseddescription.String()
}
// page navigation
type dlist struct {
type DeviationList struct {
Pages int
More bool
}
// FIXME: on some artworks the first page can make the navigation panel disappear entirely.
func (s skunkyart) NavBase(c dlist) string {
func (s skunkyart) NavBase(c DeviationList) string {
// TODO: make this clearer
// page navigation
var list strings.Builder
@ -333,7 +283,7 @@ func (s skunkyart) NavBase(c dlist) string {
}
// forward
for x := p; x <= p+6; x++ {
for x := p; x <= p+6 && c.Pages > p+6; x++ {
if x == p {
prevrev("", x, true)
x++
@ -346,277 +296,9 @@ func (s skunkyart) NavBase(c dlist) string {
}
// forward/back
if p != 417 || c.More {
if c.More {
prevrev("| Next =>", p+1, false)
}
return list.String()
}
func (s skunkyart) DeviationList(devs []devianter.Deviation, content ...dlist) string {
var list strings.Builder
if s.Atom && s.Page > 1 {
s.ReturnHTTPError(400)
return ""
} else if s.Atom {
list.WriteString(`<?xml version="1.0" encoding="UTF-8"?><feed xmlns:media="http://search.yahoo.com/mrss/" xmlns="http://www.w3.org/2005/Atom">`)
list.WriteString(`<title>SkunkyArt</title>`)
// list.WriteString(`<link rel="alternate" href="HOMEPAGE_URL"/><link href="FEED_URL" rel="self"/>`)
} else {
list.WriteString(`<div class="content">`)
}
for _, data := range devs {
if !(data.NSFW && !CFG.Nsfw) {
url := ParseMedia(data.Media)
if s.Atom {
id := strconv.Itoa(data.ID)
list.WriteString(`<entry><author><name>`)
list.WriteString(data.Author.Username)
list.WriteString(`</name></author><title>`)
list.WriteString(data.Title)
list.WriteString(`</title><link rel="alternate" type="text/html" href="`)
list.WriteString(UrlBuilder("post", data.Author.Username, "atom-"+id))
list.WriteString(`"/><id>`)
list.WriteString(id)
list.WriteString(`</id><published>`)
list.WriteString(data.PublishedTime.UTC().Format("Mon, 02 Jan 2006 15:04:05 -0700"))
list.WriteString(`</published>`)
list.WriteString(`<media:group><media:title>`)
list.WriteString(data.Title)
list.WriteString(`</media:title><media:thumbinal url="`)
list.WriteString(url)
list.WriteString(`"/></media:group><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><a href="`)
list.WriteString(data.Url)
list.WriteString(`"><img src="`)
list.WriteString(url)
list.WriteString(`"/></a><p>`)
list.WriteString(ParseDescription(data.TextContent))
list.WriteString(`</p></div></content></entry>`)
} else {
list.WriteString(`<div class="block">`)
if url != "" {
list.WriteString(`<a title="open/download" href="`)
list.WriteString(url)
list.WriteString(`"><img loading="lazy" src="`)
list.WriteString(url)
list.WriteString(`" width="15%"></a>`)
} else {
list.WriteString(`<h1>[ TEXT ]</h1>`)
}
list.WriteString(`<br><a href="`)
list.WriteString(s.ConvertDeviantArtUrlToSkunkyArt(data.Url))
list.WriteString(`">`)
list.WriteString(data.Author.Username)
list.WriteString(" - ")
list.WriteString(data.Title)
// NSFW, AI, and Daily Deviation badges
if data.NSFW {
list.WriteString(` [<span class="nsfw">NSFW</span>]`)
}
if data.AI {
list.WriteString(" [🤖]")
}
if data.DD {
list.WriteString(` [<span class="dd">DD</span>]`)
}
list.WriteString("</a></div>")
}
}
}
if s.Atom {
list.WriteString("</feed>")
s.Writer.Write([]byte(list.String()))
return ""
} else {
list.WriteString("</div>")
if content != nil {
list.WriteString(s.NavBase(content[0]))
}
}
return list.String()
}
func (s skunkyart) ParseComments(c devianter.Comments) string {
var cmmts strings.Builder
replied := make(map[int]string)
cmmts.WriteString("<details><summary>Comments: <b>")
cmmts.WriteString(strconv.Itoa(c.Total))
cmmts.WriteString("</b></summary>")
for _, x := range c.Thread {
replied[x.ID] = x.User.Username
cmmts.WriteString(`<div class="msg`)
if x.Parent > 0 {
cmmts.WriteString(` reply`)
}
cmmts.WriteString(`"><p id="`)
cmmts.WriteString(strconv.Itoa(x.ID))
cmmts.WriteString(`"><img src="`)
cmmts.WriteString(UrlBuilder("media", "emojitar", x.User.Username, "?type=a"))
cmmts.WriteString(`" width="30px" height="30px"><a href="`)
cmmts.WriteString(UrlBuilder("group_user", "?q=", x.User.Username, "&type=a"))
cmmts.WriteString(`"><b`)
cmmts.WriteString(` class="`)
if x.User.Banned {
cmmts.WriteString(`banned`)
}
if x.Author {
cmmts.WriteString(`author`)
}
cmmts.WriteString(`">`)
cmmts.WriteString(x.User.Username)
cmmts.WriteString("</b></a> ")
if x.Parent > 0 {
cmmts.WriteString(` In reply to <a href="#`)
cmmts.WriteString(strconv.Itoa(x.Parent))
cmmts.WriteString(`">`)
if replied[x.Parent] == "" {
cmmts.WriteString("???")
} else {
cmmts.WriteString(replied[x.Parent])
}
cmmts.WriteString("</a>")
}
cmmts.WriteString(" [")
cmmts.WriteString(x.Posted.UTC().String())
cmmts.WriteString("]<p>")
cmmts.WriteString(ParseDescription(x.TextContent))
cmmts.WriteString("<p>👍: ")
cmmts.WriteString(strconv.Itoa(x.Likes))
cmmts.WriteString(" ⏩: ")
cmmts.WriteString(strconv.Itoa(x.Replies))
cmmts.WriteString("</p></div>\n")
}
cmmts.WriteString(s.NavBase(dlist{
Pages: 0,
More: c.HasMore,
}))
cmmts.WriteString("</details>")
return cmmts.String()
}
func ParseMedia(media devianter.Media) string {
url := devianter.UrlFromMedia(media)
if len(url) != 0 {
url = url[21:]
dot := strings.Index(url, ".")
return UrlBuilder("media", "file", url[:dot], "/", url[dot+10:])
}
return ""
}
func (s skunkyart) DownloadAndSendMedia(subdomain, path string) {
var url strings.Builder
url.WriteString("https://images-wixmp-")
url.WriteString(subdomain)
url.WriteString(".wixmp.com/")
url.WriteString(path)
url.WriteString("?token=")
url.WriteString(s.Args.Get("token"))
download := func() (body []byte, status int, headers http.Header) {
cli := &http.Client{}
if CFG.WixmpProxy != "" {
u, e := u.Parse(CFG.WixmpProxy)
err(e)
cli.Transport = &http.Transport{Proxy: http.ProxyURL(u)}
}
req, e := http.NewRequest("GET", url.String(), nil)
err(e)
req.Header.Set("User-Agent", "Mozilla/5.0 (X11; Linux x86_64; rv:123.0) Gecko/20100101 Firefox/123.0.0")
resp, e := cli.Do(req)
err(e)
defer resp.Body.Close()
b, e := io.ReadAll(resp.Body)
err(e)
return b, resp.StatusCode, resp.Header
}
if CFG.Cache.Enabled {
os.Mkdir(CFG.Cache.Path, 0700)
fname := CFG.Cache.Path + "/" + base64.StdEncoding.EncodeToString([]byte(subdomain+path))
file, e := os.Open(fname)
if e != nil {
b, status, headers := download()
if status == 200 && headers["Content-Type"][0][:5] == "image" {
err(os.WriteFile(fname, b, 0700))
s.Writer.Write(b)
}
} else {
file, e := io.ReadAll(file)
err(e)
s.Writer.Write(file)
}
} else if CFG.Proxy {
b, _, _ := download()
s.Writer.Write(b)
} else {
s.Writer.WriteHeader(403)
s.Writer.Write([]byte("Sorry, butt proxy on this instance disabled."))
}
}
func InitCacheSystem() {
c := &CFG.Cache
for {
dir, e := os.Open(c.Path)
err(e)
stat, e := dir.Stat()
err(e)
dirnames, e := dir.Readdirnames(-1)
err(e)
for _, a := range dirnames {
a = c.Path + "/" + a
rm := func() {
err(os.RemoveAll(a))
}
if c.Lifetime != 0 {
now := time.Now().UnixMilli()
f, _ := os.Stat(a)
stat := f.Sys().(*syscall.Stat_t)
time := time.Unix(stat.Ctim.Unix()).UnixMilli()
if time+c.Lifetime <= now {
rm()
}
}
if c.MaxSize != 0 && stat.Size() > c.MaxSize {
rm()
}
}
dir.Close()
time.Sleep(time.Second * time.Duration(CFG.Cache.UpdateInterval))
}
}
func CopyTemplatesToMemory() {
try := func(e error) {
if e != nil {
panic(e.Error())
}
}
dir, e := os.ReadDir(CFG.TemplatesDir)
try(e)
for _, x := range dir {
n := CFG.TemplatesDir + "/" + x.Name()
file, e := os.ReadFile(n)
try(e)
Templates[n] = string(file)
}
}

View File

@ -1,6 +1,7 @@
package app
import (
"encoding/json"
"io"
"net/http"
"net/url"
@ -17,17 +18,33 @@ var wr = io.WriteString
var Templates = make(map[string]string)
type skunkyart struct {
Writer http.ResponseWriter
Writer http.ResponseWriter
Args url.Values
BasePath string
Type rune
Query, QueryRaw string
Page int
Atom bool
Templates struct {
Templates struct {
About struct {
Proxy bool
Nsfw bool
Proxy bool
Nsfw bool
Instances []struct {
Title string
Country string
Urls []struct {
I2P string `json:"i2p"`
Ygg string
Tor string
Clearnet string
}
Settings struct {
Nsfw bool
Proxy bool
}
}
}
SomeList string
@ -127,7 +144,7 @@ func (s skunkyart) GRUser() {
case "cover_deviation":
group.About.BGMeta = x.ModuleData.CoverDeviation.Deviation
group.About.BGMeta.Url = s.ConvertDeviantArtUrlToSkunkyArt(group.About.BGMeta.Url)
group.About.BGMeta.Url = ConvertDeviantArtUrlToSkunkyArt(group.About.BGMeta.Url)
group.About.BG = ParseMedia(group.About.BGMeta.Media)
case "group_admins":
var htm strings.Builder
@ -146,7 +163,7 @@ func (s skunkyart) GRUser() {
gallery := g.Gallery(s.Page, folderid)
if folderid > 0 {
group.Gallery.List = s.DeviationList(gallery.Content.Results, dlist{
group.Gallery.List = s.DeviationList(gallery.Content.Results, DeviationList{
More: gallery.Content.HasMore,
})
} else {
@ -157,13 +174,18 @@ func (s skunkyart) GRUser() {
for _, x := range x.ModuleData.Folders.Results {
folders.WriteString(`<div class="block folder-item">`)
folders.WriteString(`<a href="`)
folders.WriteString(s.ConvertDeviantArtUrlToSkunkyArt(x.Thumb.Url))
folders.WriteString(`"><img loading="lazy" src="`)
folders.WriteString(ParseMedia(x.Thumb.Media))
folders.WriteString(`" title="`)
folders.WriteString(x.Thumb.Title)
folders.WriteString(`"></a><br>`)
if !(x.Thumb.NSFW && !CFG.Nsfw) {
folders.WriteString(`<a href="`)
folders.WriteString(ConvertDeviantArtUrlToSkunkyArt(x.Thumb.Url))
folders.WriteString(`"><img loading="lazy" src="`)
folders.WriteString(ParseMedia(x.Thumb.Media))
folders.WriteString(`" title="`)
folders.WriteString(x.Thumb.Title)
folders.WriteString(`"></a>`)
} else {
folders.WriteString(`<h1>[ <span class="nsfw">NSFW</span> ]</h1>`)
}
folders.WriteString("<br>")
folders.WriteString(`<a href="?folder=`)
folders.WriteString(strconv.Itoa(x.FolderId))
@ -182,7 +204,7 @@ func (s skunkyart) GRUser() {
}
if x.Name == "folder_deviations" {
group.Gallery.List = s.DeviationList(x