mirror of https://github.com/MarginaliaSearch/MarginaliaSearch.git synced 2025-10-06 07:32:38 +02:00

Compare commits

...

48 Commits

Author SHA1 Message Date
Viktor Lofgren
10b6a25c63 (nsfw) Fix SQL error on duplicate domains 2025-06-11 00:11:26 +02:00
Viktor
5a8e286689 Merge pull request #204 from MarginaliaSearch/vlofgren-patch-1
Update ROADMAP.md
2025-06-07 14:01:13 +02:00
Viktor
39a055aa94 Update ROADMAP.md 2025-06-07 14:01:01 +02:00
Viktor Lofgren
37aaa90dc9 (deploy) Clean up deploy script 2025-06-07 13:43:56 +02:00
Viktor
24022c5adc Merge pull request #203 from MarginaliaSearch/nsfw-domain-lists
Nsfw blocking via UT1 domain lists
2025-06-07 13:24:05 +02:00
Viktor Lofgren
1de9ecc0b6 (nsfw) Add metrics to the filtering so we can monitor it 2025-06-07 13:17:05 +02:00
Viktor Lofgren
9b80245ea0 (nsfw) Move filtering to the IndexApiClient, and add filtering options to the internal APIs and public API. 2025-06-07 12:54:20 +02:00
Viktor Lofgren
4e1595c1a6 (nsfw) Initial work on adding UT1-based domain filtering 2025-06-06 14:23:37 +02:00
Viktor Lofgren
0be8585fa5 Add tag format hint to deploy script 2025-06-06 10:03:18 +02:00
Viktor Lofgren
a0fe070fe7 Redeploy browserless and assistant. 2025-06-06 09:51:39 +02:00
Viktor Lofgren
abe9da0fc6 (search) Ensure the new search UI sets the correct content-type for opensearch.xml 2025-05-29 12:44:55 +02:00
Viktor Lofgren
56d0128b0a (dom-sample) Remove redundant code 2025-05-28 17:43:46 +02:00
Viktor Lofgren
840b68ac55 (dom-sample) Minor cleanups 2025-05-28 16:27:27 +02:00
Viktor Lofgren
c34ff6d6c3 (dom-sample) Use WAL journal for dom sample db 2025-05-28 16:16:28 +02:00
Viktor Lofgren
32780967d8 (dom-sample) Initialize dom sampler 2025-05-28 16:06:05 +02:00
Viktor Lofgren
7330bc489d (deploy) Correct deploy script for browserless 2025-05-28 15:58:12 +02:00
Viktor Lofgren
ea23f33738 (deploy) Correct deploy script for headlesschrome 2025-05-28 15:56:05 +02:00
Viktor Lofgren
4a8a028118 (deploy) Deploy assistant and browserless 2025-05-28 15:50:26 +02:00
Viktor
a25bc647be Merge pull request #201 from MarginaliaSearch/website-capture
Capture website snapshots
2025-05-28 15:49:03 +02:00
Viktor Lofgren
a720dba3a2 (deploy) Add browserless to deploy script 2025-05-28 15:48:32 +02:00
Viktor Lofgren
284f382867 (dom-sample) Fix initialization to work the same as screenshot capture 2025-05-28 15:40:09 +02:00
Viktor Lofgren
a80717f138 (dom-sample) Cleanup 2025-05-28 15:32:54 +02:00
Viktor Lofgren
d6da715fa4 (dom-sample) Add basic retrieval logic
First iteration is single threaded for simplicity
2025-05-28 15:18:15 +02:00
Viktor Lofgren
c1ec7aa491 (dom-sample) Add a boolean to the sample db when we've accepted a cookie dialogue 2025-05-28 14:45:19 +02:00
Viktor Lofgren
3daf37e283 (dom-sample) Improve storage of DOM sample data 2025-05-28 14:34:34 +02:00
Viktor Lofgren
44a774d3a8 (browserless) Add --pull option to Docker build command
This ensures we fetch the latest base image when we build.
2025-05-28 14:09:32 +02:00
Viktor Lofgren
597aeaf496 (website-capture) Correct manifest
run_at is set at the content_script level, not the root object.
2025-05-28 14:05:16 +02:00
Viktor Lofgren
06df7892c2 (website-capture) Clean up code 2025-05-27 15:56:59 +02:00
Viktor Lofgren
dc26854268 (website-capture) Add a marker to the network log when we've accepted a cookie dialog 2025-05-27 15:21:02 +02:00
Viktor Lofgren
9f16326cba (website-capture) Add logic that automatically identifies and agrees to cookie consent popovers
Oftentimes, ads don't load until after you've agreed to the popover.
2025-05-27 15:11:47 +02:00
Viktor Lofgren
ed66d0b3a7 (website-capture) Amend the extension to also capture web request information 2025-05-26 14:00:43 +02:00
Viktor Lofgren
c3afc82dad (website-capture) Rename scripts to be more consistent with extension terminology 2025-05-26 13:13:11 +02:00
Viktor Lofgren
08e25e539e (website-capture) Minor cleanups 2025-05-21 14:55:03 +02:00
Viktor Lofgren
4946044dd0 (website-capture) Update BrowserlesClient to use the new image 2025-05-21 14:14:18 +02:00
Viktor Lofgren
edf382e1c5 (website-capture) Add a custom docker image with a new custom extension for DOM capture
The original approach of injecting javascript into the page directly didn't work with pages that reloaded themselves.  To work around this, a chrome extension is used instead that does the same work, but subscribes to reload events and re-installs the change listener.
2025-05-21 14:13:54 +02:00
Viktor Lofgren
644cba32e4 (website-capture) Remove dead imports 2025-05-20 16:08:48 +02:00
Viktor Lofgren
34b76390b2 (website-capture) Add storage object for DOM samples 2025-05-20 16:05:54 +02:00
Viktor Lofgren
43cd507971 (crawler) Add a migration workaround so we can still open old slop crawl data with the new column added 2025-05-19 14:47:38 +02:00
Viktor Lofgren
cc40e99fdc (crawler) Add a migration workaround so we can still open old slop crawl data with the new column added 2025-05-19 14:37:59 +02:00
Viktor Lofgren
8a944cf4c6 (crawler) Add request time to crawl data
This is an interesting indicator of website quality.
2025-05-19 14:07:41 +02:00
Viktor Lofgren
1c128e6d82 (crawler) Add request time to crawl data
This is an interesting indicator of website quality.
2025-05-19 14:02:03 +02:00
Viktor Lofgren
be039d1a8c (live-capture) Add a new function for capturing the DOM of a website after rendering
The new code injects a javascript that attempts to trigger popovers, and then alters the DOM to add attributes containing CSS elements with position and visibility.
2025-05-19 13:26:07 +02:00
Viktor Lofgren
4edc0d3267 (converter) Increase work buffer for converter
Conversion on index node  7 in production is crashing ostensibly because this buffer is too small.
2025-05-18 13:22:44 +02:00
Viktor Lofgren
890f521d0d (pdf) Fix crash for some bold lines 2025-05-18 13:05:05 +02:00
Viktor Lofgren
b1814a30f7 (deploy) Redeploy all services. 2025-05-17 13:11:51 +02:00
Viktor Lofgren
f59a9eb025 (legacy-search) Soften domain limit constraints in URL deduplication 2025-05-17 00:04:27 +02:00
Viktor Lofgren
599534806b (search) Soften domain limit constraints in URL deduplication 2025-05-17 00:00:42 +02:00
Viktor Lofgren
7e8253dac7 (search) Clean up debug logging 2025-05-17 00:00:28 +02:00
66 changed files with 2066 additions and 256 deletions

View File

@@ -38,14 +38,6 @@ associated with each language added, at least a models file or two, as well as s
It would be very helpful to find a speaker of a large language other than English to help in the fine tuning.
## Support for binary formats like PDF
The crawler needs to be modified to retain them, and the conversion logic needs to parse them.
The documents database probably should have some sort of flag indicating it's a PDF as well.
PDF parsing is known to be a bit of a security liability so some thought needs to be put in
that direction as well.
## Custom ranking logic
Stract does an interesting thing where they have configurable search filters.
@@ -66,6 +58,14 @@ One of the search engine's biggest limitations right now is that it does not ind
# Completed
## Support for binary formats like PDF (COMPLETED 2025-05)
The crawler needs to be modified to retain them, and the conversion logic needs to parse them.
The documents database probably should have some sort of flag indicating it's a PDF as well.
PDF parsing is known to be a bit of a security liability so some thought needs to be put in
that direction as well.
## Web Design Overhaul (COMPLETED 2025-01)
The design is kinda clunky and hard to maintain, and needlessly outdated-looking.

View File

@@ -0,0 +1,5 @@
CREATE TABLE IF NOT EXISTS WMSA_prod.NSFW_DOMAINS (
ID INT NOT NULL AUTO_INCREMENT,
TIER INT NOT NULL,
PRIMARY KEY (ID)
);

View File

@@ -112,14 +112,6 @@ public class EdgeDomain implements Serializable {
return topDomain;
}
public String getDomainKey() {
int cutPoint = topDomain.indexOf('.');
if (cutPoint < 0) {
return topDomain;
}
return topDomain.substring(0, cutPoint).toLowerCase();
}
/** If possible, try to provide an alias domain,
* i.e. a domain name that is very likely to link to this one
* */

View File

@@ -8,14 +8,6 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
class EdgeDomainTest {
@Test
public void testSkepdic() throws URISyntaxException {
var domain = new EdgeUrl("http://www.skepdic.com/astrology.html");
assertEquals("skepdic", domain.getDomain().getDomainKey());
var domain2 = new EdgeUrl("http://skepdic.com/astrology.html");
assertEquals("skepdic", domain2.getDomain().getDomainKey());
}
@Test
public void testHkDomain() throws URISyntaxException {
var domain = new EdgeUrl("http://l7072i3.l7c.net");

View File

@@ -37,6 +37,7 @@ dependencies {
implementation project(':code:functions:link-graph:api')
implementation project(':code:functions:live-capture:api')
implementation project(':code:functions:search-query')
implementation project(':code:functions:nsfw-domain-filter')
implementation project(':code:execution:api')
implementation project(':code:processes:crawling-process:model')

View File

@@ -6,6 +6,7 @@ import java.util.Set;
public enum ExecutorActor {
PREC_EXPORT_ALL(NodeProfile.BATCH_CRAWL, NodeProfile.MIXED),
SYNC_NSFW_LISTS(NodeProfile.BATCH_CRAWL, NodeProfile.MIXED),
CRAWL(NodeProfile.BATCH_CRAWL, NodeProfile.MIXED),
RECRAWL(NodeProfile.BATCH_CRAWL, NodeProfile.MIXED),
@@ -35,7 +36,8 @@ public enum ExecutorActor {
LIVE_CRAWL(NodeProfile.REALTIME),
PROC_LIVE_CRAWL_SPAWNER(NodeProfile.REALTIME),
SCRAPE_FEEDS(NodeProfile.REALTIME),
UPDATE_RSS(NodeProfile.REALTIME)
;
public String id() {
return "fsm:" + name().toLowerCase();

View File

@@ -68,6 +68,7 @@ public class ExecutorActorControlService {
ExecutorActorStateMachines stateMachines,
MigrateCrawlDataActor migrateCrawlDataActor,
ExportAllPrecessionActor exportAllPrecessionActor,
UpdateNsfwFiltersActor updateNsfwFiltersActor,
UpdateRssActor updateRssActor) throws SQLException {
this.messageQueueFactory = messageQueueFactory;
this.eventLog = baseServiceParams.eventLog;
@@ -109,6 +110,7 @@ public class ExecutorActorControlService {
register(ExecutorActor.UPDATE_RSS, updateRssActor);
register(ExecutorActor.MIGRATE_CRAWL_DATA, migrateCrawlDataActor);
register(ExecutorActor.SYNC_NSFW_LISTS, updateNsfwFiltersActor);
if (serviceConfiguration.node() == 1) {
register(ExecutorActor.PREC_EXPORT_ALL, exportAllPrecessionActor);

View File

@@ -0,0 +1,53 @@
package nu.marginalia.actor.task;
import com.google.gson.Gson;
import com.google.inject.Inject;
import com.google.inject.Singleton;
import nu.marginalia.actor.prototype.RecordActorPrototype;
import nu.marginalia.actor.state.ActorStep;
import nu.marginalia.nsfw.NsfwDomainFilter;
import nu.marginalia.service.module.ServiceConfiguration;
@Singleton
public class UpdateNsfwFiltersActor extends RecordActorPrototype {
private final ServiceConfiguration serviceConfiguration;
private final NsfwDomainFilter nsfwDomainFilter;
public record Initial() implements ActorStep {}
public record Run() implements ActorStep {}
@Override
public ActorStep transition(ActorStep self) throws Exception {
return switch(self) {
case Initial() -> {
if (serviceConfiguration.node() != 1) {
yield new Error("This actor can only run on node 1");
}
else {
yield new Run();
}
}
case Run() -> {
nsfwDomainFilter.fetchLists();
yield new End();
}
default -> new Error();
};
}
@Override
public String describe() {
return "Sync NSFW filters";
}
@Inject
public UpdateNsfwFiltersActor(Gson gson,
ServiceConfiguration serviceConfiguration,
NsfwDomainFilter nsfwDomainFilter)
{
super(gson);
this.serviceConfiguration = serviceConfiguration;
this.nsfwDomainFilter = nsfwDomainFilter;
}
}

View File

@@ -25,9 +25,9 @@ dependencies {
implementation project(':code:execution:api')
implementation project(':code:processes:crawling-process:ft-content-type')
implementation project(':third-party:rssreader')
implementation libs.jsoup
implementation project(':third-party:rssreader')
implementation libs.opencsv
implementation libs.slop
implementation libs.sqlite
@@ -57,8 +57,6 @@ dependencies {
implementation libs.bundles.gson
implementation libs.bundles.mariadb
testImplementation libs.bundles.slf4j.test
testImplementation libs.bundles.junit
testImplementation libs.mockito

View File

@@ -0,0 +1,126 @@
package nu.marginalia.domsample;
import com.google.inject.Inject;
import com.zaxxer.hikari.HikariDataSource;
import jakarta.inject.Named;
import nu.marginalia.domsample.db.DomSampleDb;
import nu.marginalia.livecapture.BrowserlessClient;
import nu.marginalia.service.module.ServiceConfiguration;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.net.URI;
import java.net.URISyntaxException;
import java.time.Duration;
import java.util.HashSet;
import java.util.Set;
import java.util.concurrent.TimeUnit;
public class DomSampleService {
private final DomSampleDb db;
private final HikariDataSource mariadbDataSource;
private final URI browserlessURI;
private static final Logger logger = LoggerFactory.getLogger(DomSampleService.class);
@Inject
public DomSampleService(DomSampleDb db,
HikariDataSource mariadbDataSource,
@Named("browserless-uri") String browserlessAddress,
ServiceConfiguration serviceConfiguration)
throws URISyntaxException
{
this.db = db;
this.mariadbDataSource = mariadbDataSource;
if (StringUtils.isEmpty(browserlessAddress) || serviceConfiguration.node() > 1) {
logger.warn("Live capture service will not run");
browserlessURI = null;
}
else {
browserlessURI = new URI(browserlessAddress);
}
}
public void start() {
if (browserlessURI == null) {
logger.warn("DomSampleService is not enabled due to missing browserless URI or multi-node configuration");
return;
}
Thread.ofPlatform().daemon().start(this::run);
}
public void syncDomains() {
Set<String> dbDomains = new HashSet<>();
logger.info("Fetching domains from database...");
try (var conn = mariadbDataSource.getConnection();
var stmt = conn.prepareStatement("""
SELECT DOMAIN_NAME
FROM EC_DOMAIN
WHERE NODE_AFFINITY>0
""")
) {
var rs = stmt.executeQuery();
while (rs.next()) {
dbDomains.add(rs.getString("DOMAIN_NAME"));
}
} catch (Exception e) {
throw new RuntimeException("Failed to sync domains", e);
}
logger.info("Found {} domains in database", dbDomains.size());
db.syncDomains(dbDomains);
logger.info("Synced domains to sqlite");
}
public void run() {
try (var client = new BrowserlessClient(browserlessURI)) {
while (!Thread.currentThread().isInterrupted()) {
try {
// Grace sleep in case we're operating on an empty domain list
TimeUnit.SECONDS.sleep(15);
syncDomains();
var domains = db.getScheduledDomains();
for (var domain : domains) {
updateDomain(client, domain);
}
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
logger.info("DomSampleService interrupted, stopping...");
return;
} catch (Exception e) {
logger.error("Error in DomSampleService run loop", e);
}
}
}
}
private void updateDomain(BrowserlessClient client, String domain) {
var rootUrl = "https://" + domain + "/";
try {
var content = client.annotatedContent(rootUrl, new BrowserlessClient.GotoOptions("load", Duration.ofSeconds(10).toMillis()));
if (content.isPresent()) {
db.saveSample(domain, rootUrl, content.get());
}
} catch (Exception e) {
logger.error("Failed to process domain: " + domain, e);
}
finally {
db.flagDomainAsFetched(domain);
}
}
}

View File

@@ -0,0 +1,174 @@
package nu.marginalia.domsample.db;
import nu.marginalia.WmsaHome;
import org.jsoup.Jsoup;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.*;
public class DomSampleDb implements AutoCloseable {
private static final String dbFileName = "dom-sample.db";
private final Connection connection;
public DomSampleDb() throws SQLException{
this(WmsaHome.getDataPath().resolve(dbFileName));
}
public DomSampleDb(Path dbPath) throws SQLException {
String dbUrl = "jdbc:sqlite:" + dbPath.toAbsolutePath();
connection = DriverManager.getConnection(dbUrl);
try (var stmt = connection.createStatement()) {
stmt.executeUpdate("CREATE TABLE IF NOT EXISTS samples (url TEXT PRIMARY KEY, domain TEXT, sample BLOB, requests BLOB, accepted_popover BOOLEAN DEFAULT FALSE)");
stmt.executeUpdate("CREATE INDEX IF NOT EXISTS domain_index ON samples (domain)");
stmt.executeUpdate("CREATE TABLE IF NOT EXISTS schedule (domain TEXT PRIMARY KEY, last_fetch TIMESTAMP DEFAULT NULL)");
stmt.execute("PRAGMA journal_mode=WAL");
}
}
public void syncDomains(Set<String> domains) {
Set<String> currentDomains = new HashSet<>();
try (var stmt = connection.prepareStatement("SELECT domain FROM schedule")) {
var rs = stmt.executeQuery();
while (rs.next()) {
currentDomains.add(rs.getString("domain"));
}
} catch (SQLException e) {
throw new RuntimeException("Failed to sync domains", e);
}
Set<String> toRemove = new HashSet<>(currentDomains);
Set<String> toAdd = new HashSet<>(domains);
toRemove.removeAll(domains);
toAdd.removeAll(currentDomains);
try (var removeStmt = connection.prepareStatement("DELETE FROM schedule WHERE domain = ?");
var addStmt = connection.prepareStatement("INSERT OR IGNORE INTO schedule (domain) VALUES (?)")
) {
for (String domain : toRemove) {
removeStmt.setString(1, domain);
removeStmt.executeUpdate();
}
for (String domain : toAdd) {
addStmt.setString(1, domain);
addStmt.executeUpdate();
}
} catch (SQLException e) {
throw new RuntimeException("Failed to remove domains", e);
}
}
public List<String> getScheduledDomains() {
List<String> domains = new ArrayList<>();
try (var stmt = connection.prepareStatement("SELECT domain FROM schedule ORDER BY last_fetch IS NULL DESC, last_fetch ASC")) {
var rs = stmt.executeQuery();
while (rs.next()) {
domains.add(rs.getString("domain"));
}
} catch (SQLException e) {
throw new RuntimeException("Failed to get scheduled domains", e);
}
return domains;
}
public void flagDomainAsFetched(String domain) {
try (var stmt = connection.prepareStatement("INSERT OR REPLACE INTO schedule (domain, last_fetch) VALUES (?, CURRENT_TIMESTAMP)")) {
stmt.setString(1, domain);
stmt.executeUpdate();
} catch (SQLException e) {
throw new RuntimeException("Failed to flag domain as fetched", e);
}
}
public record Sample(String url, String domain, String sample, String requests, boolean acceptedPopover) {}
public List<Sample> getSamples(String domain) throws SQLException {
List<Sample> samples = new ArrayList<>();
try (var stmt = connection.prepareStatement("""
SELECT url, sample, requests, accepted_popover
FROM samples
WHERE domain = ?
"""))
{
stmt.setString(1, domain);
var rs = stmt.executeQuery();
while (rs.next()) {
samples.add(
new Sample(
rs.getString("url"),
domain,
rs.getString("sample"),
rs.getString("requests"),
rs.getBoolean("accepted_popover")
)
);
}
}
return samples;
}
public void saveSample(String domain, String url, String rawContent) throws SQLException {
var doc = Jsoup.parse(rawContent);
var networkRequests = doc.getElementById("marginalia-network-requests");
boolean acceptedPopover = false;
StringBuilder requestTsv = new StringBuilder();
if (networkRequests != null) {
acceptedPopover = !networkRequests.getElementsByClass("marginalia-agreed-cookies").isEmpty();
for (var request : networkRequests.getElementsByClass("network-request")) {
String method = request.attr("data-method");
String urlAttr = request.attr("data-url");
String timestamp = request.attr("data-timestamp");
requestTsv
.append(method)
.append('\t')
.append(timestamp)
.append('\t')
.append(urlAttr.replace('\n', ' '))
.append("\n");
}
networkRequests.remove();
}
doc.body().removeAttr("id");
String sample = doc.html();
saveSampleRaw(domain, url, sample, requestTsv.toString().trim(), acceptedPopover);
}
public void saveSampleRaw(String domain, String url, String sample, String requests, boolean acceptedPopover) throws SQLException {
try (var stmt = connection.prepareStatement("""
INSERT OR REPLACE
INTO samples (domain, url, sample, requests, accepted_popover)
VALUES (?, ?, ?, ?, ?)
""")) {
stmt.setString(1, domain);
stmt.setString(2, url);
stmt.setString(3, sample);
stmt.setString(4, requests);
stmt.setBoolean(5, acceptedPopover);
stmt.executeUpdate();
}
}
public void close() throws SQLException {
connection.close();
}
}
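
Below, a minimal sketch (not part of the diff) of what saveSample() extracts from a captured DOM; the HTML fragment and database path are hypothetical, but the element id, class names, and data attributes match the parsing code above:

import nu.marginalia.domsample.db.DomSampleDb;
import java.nio.file.Path;

class DomSampleDbSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical capture: one network request plus the cookie-consent marker,
        // wrapped in the #marginalia-network-requests element the parser looks for
        String rawContent = """
                <html><body>
                <div id="marginalia-network-requests">
                  <div class="network-request" data-method="GET" data-url="https://example.com/ad.js" data-timestamp="1000"></div>
                  <div class="marginalia-agreed-cookies"></div>
                </div>
                <p>Page content</p>
                </body></html>
                """;
        try (var db = new DomSampleDb(Path.of("/tmp/dom-sample-sketch.db"))) {
            db.saveSample("example.com", "https://example.com/", rawContent);
            // The requests column now holds one TSV row: GET <tab> 1000 <tab> https://example.com/ad.js
            // accepted_popover is true, and the requests element is stripped from the stored sample
            System.out.println(db.getSamples("example.com").getFirst().requests());
        }
    }
}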

View File

@@ -8,10 +8,13 @@ import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.Optional;
@@ -60,6 +63,42 @@ public class BrowserlessClient implements AutoCloseable {
return Optional.of(rsp.body());
}
/** Fetches content with a marginalia hack extension loaded that decorates the DOM with attributes for
 * certain CSS properties, making it easier to identify popovers and other nuisance elements.
 */
public Optional<String> annotatedContent(String url, GotoOptions gotoOptions) throws IOException, InterruptedException {
Map<String, Object> requestData = Map.of(
"url", url,
"userAgent", userAgent,
"gotoOptions", gotoOptions,
"waitForSelector", Map.of("selector", "#marginaliahack", "timeout", 15000)
);
// Launch parameters for the browserless instance to load the extension
Map<String, Object> launchParameters = Map.of(
"args", List.of("--load-extension=/dom-export")
);
String launchParametersStr = URLEncoder.encode(gson.toJson(launchParameters), StandardCharsets.UTF_8);
var request = HttpRequest.newBuilder()
.uri(browserlessURI.resolve("/content?token="+BROWSERLESS_TOKEN+"&launch="+launchParametersStr))
.method("POST", HttpRequest.BodyPublishers.ofString(
gson.toJson(requestData)
))
.header("Content-type", "application/json")
.build();
var rsp = httpClient.send(request, HttpResponse.BodyHandlers.ofString());
if (rsp.statusCode() >= 300) {
logger.info("Failed to fetch annotated content for {}, status {}", url, rsp.statusCode());
return Optional.empty();
}
return Optional.of(rsp.body());
}
public byte[] screenshot(String url, GotoOptions gotoOptions, ScreenshotOptions screenshotOptions)
throws IOException, InterruptedException {
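
As a usage sketch (the local URI is hypothetical; it assumes a marginalia-browserless container with the /dom-export extension baked in, as exercised by the test further down):

import nu.marginalia.livecapture.BrowserlessClient;
import java.net.URI;
import java.util.Optional;

class AnnotatedContentSketch {
    public static void main(String[] args) throws Exception {
        try (var client = new BrowserlessClient(new URI("http://localhost:3000"))) {
            Optional<String> content = client.annotatedContent(
                    "https://www.marginalia.nu/",
                    BrowserlessClient.GotoOptions.defaultValues());
            // An empty Optional means the fetch failed or the #marginaliahack marker
            // never appeared within the 15 second waitForSelector timeout
            content.ifPresent(html -> System.out.println(html.length()));
        }
    }
}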

View File

@@ -126,7 +126,6 @@ public class LiveCaptureGrpcService
}
else {
EdgeDomain domain = domainNameOpt.get();
String domainNameStr = domain.toString();
if (!isValidDomainForCapture(domain)) {
ScreenshotDbOperations.flagDomainAsFetched(conn, domain);

View File

@@ -0,0 +1,113 @@
package nu.marginalia.domsample.db;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.testcontainers.shaded.org.apache.commons.io.FileUtils;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.*;
import static org.junit.jupiter.api.Assertions.*;
class DomSampleDbTest {
Path tempDir;
@BeforeEach
void setUp() throws Exception {
tempDir = Files.createTempDirectory("test");
}
@AfterEach
void tearDown() throws IOException {
FileUtils.deleteDirectory(tempDir.toFile());
}
@Test
public void testSetUp() {
var dbPath = tempDir.resolve("test.db");
try (var db = new DomSampleDb(dbPath)) {
}
catch (Exception e) {
fail("Failed to set up database: " + e.getMessage());
}
}
@Test
public void testSyncDomains() {
var dbPath = tempDir.resolve("test.db");
try (var db = new DomSampleDb(dbPath)) {
db.syncDomains(Set.of("example.com", "test.com", "foobar.com"));
assertEquals(Set.of("example.com", "test.com", "foobar.com"), new HashSet<>(db.getScheduledDomains()));
db.syncDomains(Set.of("example.com", "test.com"));
assertEquals(Set.of("example.com", "test.com"), new HashSet<>(db.getScheduledDomains()));
db.syncDomains(Set.of("foobar.com", "test.com"));
assertEquals(Set.of("foobar.com", "test.com"), new HashSet<>(db.getScheduledDomains()));
}
catch (Exception e) {
fail("Failed to sync domains: " + e.getMessage());
}
}
@Test
public void testFetchDomains() {
var dbPath = tempDir.resolve("test.db");
try (var db = new DomSampleDb(dbPath)) {
db.syncDomains(Set.of("example.com", "test.com", "foobar.com"));
db.flagDomainAsFetched("example.com");
db.flagDomainAsFetched("test.com");
db.flagDomainAsFetched("foobar.com");
assertEquals(List.of("example.com", "test.com", "foobar.com"), db.getScheduledDomains());
db.flagDomainAsFetched("test.com");
assertEquals(List.of("example.com", "foobar.com", "test.com"), db.getScheduledDomains());
}
catch (Exception e) {
fail("Failed to sync domains: " + e.getMessage());
}
}
@Test
public void saveLoadSingle() {
var dbPath = tempDir.resolve("test.db");
try (var db = new DomSampleDb(dbPath)) {
db.saveSampleRaw("example.com", "http://example.com/sample", "sample data", "requests data", true);
var samples = db.getSamples("example.com");
assertEquals(1, samples.size());
var sample = samples.getFirst();
assertEquals("example.com", sample.domain());
assertEquals("http://example.com/sample", sample.url());
assertEquals("sample data", sample.sample());
assertEquals("requests data", sample.requests());
assertTrue(sample.acceptedPopover());
}
catch (Exception e) {
fail("Failed to save/load sample: " + e.getMessage());
}
}
@Test
public void saveLoadTwo() {
var dbPath = tempDir.resolve("test.db");
try (var db = new DomSampleDb(dbPath)) {
db.saveSampleRaw("example.com", "http://example.com/sample", "sample data", "r1", true);
db.saveSampleRaw("example.com", "http://example.com/sample2", "sample data2", "r2", false);
var samples = db.getSamples("example.com");
assertEquals(2, samples.size());
Map<String, String> samplesByUrl = new HashMap<>();
for (var sample : samples) {
samplesByUrl.put(sample.url(), sample.sample());
}
assertEquals("sample data", samplesByUrl.get("http://example.com/sample"));
assertEquals("sample data2", samplesByUrl.get("http://example.com/sample2"));
}
catch (Exception e) {
fail("Failed to save/load sample: " + e.getMessage());
}
}
}

View File

@@ -3,17 +3,21 @@ package nu.marginalia.livecapture;
import com.github.tomakehurst.wiremock.WireMockServer;
import com.github.tomakehurst.wiremock.core.WireMockConfiguration;
import nu.marginalia.WmsaHome;
import nu.marginalia.domsample.db.DomSampleDb;
import nu.marginalia.service.module.ServiceConfigurationModule;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.images.PullPolicy;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;
import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import static com.github.tomakehurst.wiremock.client.WireMock.*;
@@ -22,9 +26,14 @@ import static com.github.tomakehurst.wiremock.client.WireMock.*;
@Testcontainers
@Tag("slow")
public class BrowserlessClientTest {
static GenericContainer<?> container = new GenericContainer<>(DockerImageName.parse("browserless/chrome")) // Run gradle docker if this image is not available
static GenericContainer<?> container = new GenericContainer<>(DockerImageName.parse("marginalia-browserless"))
.withEnv(Map.of("TOKEN", "BROWSERLESS_TOKEN"))
.withImagePullPolicy(PullPolicy.defaultPolicy())
.withNetworkMode("bridge")
.withLogConsumer(frame -> {
System.out.print(frame.getUtf8String());
})
.withExposedPorts(3000);
static WireMockServer wireMockServer =
@@ -34,6 +43,7 @@ public class BrowserlessClientTest {
static String localIp;
static URI browserlessURI;
static URI browserlessWssURI;
@BeforeAll
public static void setup() throws IOException {
@@ -44,6 +54,12 @@ public class BrowserlessClientTest {
container.getMappedPort(3000))
);
browserlessWssURI = URI.create(String.format("ws://%s:%d/?token=BROWSERLESS_TOKEN",
container.getHost(),
container.getMappedPort(3000))
);
wireMockServer.start();
wireMockServer.stubFor(get("/").willReturn(aResponse().withStatus(200).withBody("Ok")));
@@ -85,6 +101,30 @@ public class BrowserlessClientTest {
}
}
@Test
public void testAnnotatedContent() throws Exception {
try (var client = new BrowserlessClient(browserlessURI);
DomSampleDb dbop = new DomSampleDb(Path.of("/tmp/dom-sample.db"))
) {
var content = client.annotatedContent("https://marginalia.nu/", BrowserlessClient.GotoOptions.defaultValues()).orElseThrow();
dbop.saveSample("marginalia.nu", "https://marginalia.nu/", content);
System.out.println(content);
Assertions.assertFalse(content.isBlank(), "Content should not be empty");
dbop.getSamples("marginalia.nu").forEach(sample -> {
System.out.println("Sample URL: " + sample.url());
System.out.println("Sample Content: " + sample.sample());
System.out.println("Sample Requests: " + sample.requests());
System.out.println("Accepted Popover: " + sample.acceptedPopover());
});
}
finally {
Files.deleteIfExists(Path.of("/tmp/dom-sample.db"));
}
}
@Test
public void testScreenshot() throws Exception {
try (var client = new BrowserlessClient(browserlessURI)) {

View File

@@ -0,0 +1,43 @@
plugins {
id 'java'
id 'jvm-test-suite'
}
java {
toolchain {
languageVersion.set(JavaLanguageVersion.of(rootProject.ext.jvmVersion))
}
}
apply from: "$rootProject.projectDir/srcsets.gradle"
dependencies {
implementation project(':code:common:config')
implementation project(':code:common:model')
implementation project(':code:common:db')
implementation libs.bundles.slf4j
implementation libs.prometheus
implementation libs.guava
implementation libs.commons.lang3
implementation dependencies.create(libs.guice.get()) {
exclude group: 'com.google.guava'
}
implementation libs.notnull
implementation libs.fastutil
implementation libs.bundles.mariadb
testImplementation libs.bundles.slf4j.test
testImplementation libs.bundles.junit
testImplementation libs.mockito
testImplementation platform('org.testcontainers:testcontainers-bom:1.17.4')
testImplementation libs.commons.codec
testImplementation project(':code:common:service')
testImplementation 'org.testcontainers:mariadb:1.17.4'
testImplementation 'org.testcontainers:junit-jupiter:1.17.4'
testImplementation project(':code:libraries:test-helpers')
}

View File

@@ -0,0 +1,192 @@
package nu.marginalia.nsfw;
import com.google.inject.Inject;
import com.google.inject.Singleton;
import com.google.inject.name.Named;
import com.zaxxer.hikari.HikariDataSource;
import it.unimi.dsi.fastutil.ints.IntOpenHashSet;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.zip.GZIPInputStream;
@Singleton
public class NsfwDomainFilter {
private final HikariDataSource dataSource;
private final List<String> dangerLists;
private final List<String> smutLists;
private volatile IntOpenHashSet blockedDomainIdsTier1 = new IntOpenHashSet();
private volatile IntOpenHashSet blockedDomainIdsTier2 = new IntOpenHashSet();
private static final Logger logger = LoggerFactory.getLogger(NsfwDomainFilter.class);
public static final int NSFW_DISABLE = 0;
public static final int NSFW_BLOCK_DANGER = 1;
public static final int NSFW_BLOCK_SMUT = 2;
@Inject
public NsfwDomainFilter(HikariDataSource dataSource,
@Named("nsfw.dangerLists") List<String> dangerLists,
@Named("nsfw.smutLists") List<String> smutLists
) {
this.dataSource = dataSource;
this.dangerLists = dangerLists;
this.smutLists = smutLists;
Thread.ofPlatform().daemon().name("NsfwDomainFilterSync").start(() -> {
while (true) {
sync();
try {
TimeUnit.HOURS.sleep(1);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
break; // Exit the loop if interrupted
}
}
});
}
public boolean isBlocked(int domainId, int tier) {
if (tier == 0)
return false;
if (tier >= 1 && blockedDomainIdsTier1.contains(domainId))
return true;
if (tier >= 2 && blockedDomainIdsTier2.contains(domainId))
return true;
return false;
}
private synchronized void sync() {
try (var conn = dataSource.getConnection();
var stmt = conn.prepareStatement("SELECT ID, TIER FROM NSFW_DOMAINS")
) {
var rs = stmt.executeQuery();
IntOpenHashSet tier1 = new IntOpenHashSet();
IntOpenHashSet tier2 = new IntOpenHashSet();
while (rs.next()) {
int domainId = rs.getInt("ID");
int tier = rs.getInt("TIER");
switch (tier) {
case 1 -> tier1.add(domainId);
case 2 -> tier2.add(domainId);
}
}
this.blockedDomainIdsTier1 = tier1;
this.blockedDomainIdsTier2 = tier2;
logger.info("NSFW domain filter synced: {} tier 1, {} tier 2", tier1.size(), tier2.size());
}
catch (SQLException ex) {
logger.error("Failed to sync NSFW domain filter", ex);
}
}
public synchronized void fetchLists() {
try (var conn = dataSource.getConnection();
HttpClient client = HttpClient.newBuilder()
.followRedirects(HttpClient.Redirect.ALWAYS)
.build();
var stmt = conn.createStatement();
var insertStmt = conn.prepareStatement("INSERT IGNORE INTO NSFW_DOMAINS_TMP (ID, TIER) SELECT ID, ? FROM EC_DOMAIN WHERE DOMAIN_NAME = ?")) {
stmt.execute("DROP TABLE IF EXISTS NSFW_DOMAINS_TMP");
stmt.execute("CREATE TABLE NSFW_DOMAINS_TMP LIKE NSFW_DOMAINS");
List<String> combinedDangerList = new ArrayList<>(10_000);
for (var dangerListUrl : dangerLists) {
combinedDangerList.addAll(fetchList(client, dangerListUrl));
}
for (String domain : combinedDangerList) {
insertStmt.setInt(1, NSFW_BLOCK_DANGER);
insertStmt.setString(2, domain);
insertStmt.execute();
}
List<String> combinedSmutList = new ArrayList<>(10_000);
for (var smutListUrl : smutLists) {
combinedSmutList.addAll(fetchList(client, smutListUrl));
}
for (String domain : combinedSmutList) {
insertStmt.setInt(1, NSFW_BLOCK_SMUT);
insertStmt.setString(2, domain);
insertStmt.execute();
}
stmt.execute("""
DROP TABLE IF EXISTS NSFW_DOMAINS
""");
stmt.execute("""
RENAME TABLE NSFW_DOMAINS_TMP TO NSFW_DOMAINS
""");
sync();
}
catch (SQLException ex) {
logger.error("Failed to fetch NSFW domain lists", ex);
}
}
public List<String> fetchList(HttpClient client, String url) {
logger.info("Fetching NSFW domain list from {}", url);
var request = HttpRequest.newBuilder()
.uri(java.net.URI.create(url))
.build();
try {
if (url.endsWith(".gz")) {
var response = client.send(request, HttpResponse.BodyHandlers.ofByteArray());
byte[] body = response.body();
try (var reader = new BufferedReader(new InputStreamReader(new GZIPInputStream(new ByteArrayInputStream(body))))) {
return reader.lines()
.filter(StringUtils::isNotEmpty)
.toList();
} catch (Exception e) {
logger.error("Error reading GZIP response from {}", url, e);
}
} else {
var response = client.send(request, HttpResponse.BodyHandlers.ofString());
if (response.statusCode() == 200) {
return Arrays.stream(StringUtils.split(response.body(), "\n"))
.filter(StringUtils::isNotEmpty)
.toList();
} else {
logger.warn("Failed to fetch list from {}: HTTP {}", url, response.statusCode());
}
}
}
catch (Exception e) {
logger.error("Error fetching NSFW domain list from {}", url, e);
}
return List.of();
}
}
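
To make the tier semantics explicit, here is a condensed restatement of the isBlocked() gate above, with hypothetical domain ids: tier 0 blocks nothing, tier 1 applies the danger set, and tier 2 applies the danger and smut sets together.

import it.unimi.dsi.fastutil.ints.IntOpenHashSet;

class TierGateSketch {
    // Same tier logic as NsfwDomainFilter.isBlocked, with the sets passed in explicitly
    static boolean isBlocked(IntOpenHashSet tier1, IntOpenHashSet tier2, int domainId, int tier) {
        if (tier >= 1 && tier1.contains(domainId)) return true;
        if (tier >= 2 && tier2.contains(domainId)) return true;
        return false;
    }

    public static void main(String[] args) {
        var danger = new IntOpenHashSet(new int[] {1}); // hypothetical danger-listed domain id
        var smut = new IntOpenHashSet(new int[] {2});   // hypothetical smut-listed domain id
        System.out.println(isBlocked(danger, smut, 1, 0)); // false: filtering disabled
        System.out.println(isBlocked(danger, smut, 1, 1)); // true:  danger set active
        System.out.println(isBlocked(danger, smut, 2, 1)); // false: smut set not active at tier 1
        System.out.println(isBlocked(danger, smut, 2, 2)); // true:  tier 2 adds the smut set
    }
}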

View File

@@ -0,0 +1,30 @@
package nu.marginalia.nsfw;
import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import jakarta.inject.Named;
import java.util.List;
public class NsfwFilterModule extends AbstractModule {
@Provides
@Named("nsfw.dangerLists")
public List<String> nsfwDomainLists1() {
return List.of(
"https://raw.githubusercontent.com/olbat/ut1-blacklists/refs/heads/master/blacklists/cryptojacking/domains",
"https://raw.githubusercontent.com/olbat/ut1-blacklists/refs/heads/master/blacklists/malware/domains",
"https://raw.githubusercontent.com/olbat/ut1-blacklists/refs/heads/master/blacklists/phishing/domains"
);
}
@Provides
@Named("nsfw.smutLists")
public List<String> nsfwDomainLists2() {
return List.of(
"https://github.com/olbat/ut1-blacklists/raw/refs/heads/master/blacklists/adult/domains.gz",
"https://raw.githubusercontent.com/olbat/ut1-blacklists/refs/heads/master/blacklists/gambling/domains"
);
}
public void configure() {}
}

View File

@@ -0,0 +1,108 @@
package nu.marginalia.nsfw;
import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Provides;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import jakarta.inject.Named;
import nu.marginalia.test.TestMigrationLoader;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.MariaDBContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;
@Tag("slow")
@Testcontainers
class NsfwDomainFilterTest extends AbstractModule {
@Container
static MariaDBContainer<?> mariaDBContainer = new MariaDBContainer<>("mariadb")
.withDatabaseName("WMSA_prod")
.withUsername("wmsa")
.withPassword("wmsa")
.withNetworkAliases("mariadb");
static HikariDataSource dataSource;
static Path tempDir;
@BeforeAll
public static void setUpDb() throws IOException {
tempDir = Files.createTempDirectory(NsfwDomainFilterTest.class.getSimpleName());
System.setProperty("system.homePath", tempDir.toString());
HikariConfig config = new HikariConfig();
config.setJdbcUrl(mariaDBContainer.getJdbcUrl());
config.setUsername("wmsa");
config.setPassword("wmsa");
dataSource = new HikariDataSource(config);
TestMigrationLoader.flywayMigration(dataSource);
try (var conn = dataSource.getConnection();
var stmt = conn.prepareStatement("INSERT INTO EC_DOMAIN (DOMAIN_NAME, DOMAIN_TOP, NODE_AFFINITY) VALUES (?, ?, 1)")
) {
// Ensure the database is ready
conn.createStatement().execute("SELECT 1");
stmt.setString(1, "www.google.com");
stmt.setString(2, "google.com");
stmt.executeUpdate();
stmt.setString(1, "www.bing.com");
stmt.setString(2, "bing.com");
stmt.executeUpdate();
} catch (Exception e) {
throw new RuntimeException("Failed to connect to the database", e);
}
}
@Provides
@Named("nsfw.dangerLists")
public List<String> nsfwDomainLists1() {
return List.of(
"https://downloads.marginalia.nu/test/list1"
);
}
@Provides
@Named("nsfw.smutLists")
public List<String> nsfwDomainLists2() {
return List.of(
"https://downloads.marginalia.nu/test/list2.gz"
);
}
public void configure() {
bind(HikariDataSource.class).toInstance(dataSource);
}
@Test
public void test() {
var filter = Guice
.createInjector(this)
.getInstance(NsfwDomainFilter.class);
filter.fetchLists();
assertTrue(filter.isBlocked(1, NsfwDomainFilter.NSFW_BLOCK_DANGER));
assertTrue(filter.isBlocked(1, NsfwDomainFilter.NSFW_BLOCK_SMUT));
assertFalse(filter.isBlocked(2, NsfwDomainFilter.NSFW_BLOCK_DANGER));
assertTrue(filter.isBlocked(2, NsfwDomainFilter.NSFW_BLOCK_SMUT));
}
}

View File

@@ -1,9 +1,6 @@
package nu.marginalia.api.searchquery;
import nu.marginalia.api.searchquery.model.query.*;
import nu.marginalia.api.searchquery.model.query.ProcessedQuery; import nu.marginalia.api.searchquery.model.query.*;
import nu.marginalia.api.searchquery.model.query.QueryParams;
import nu.marginalia.api.searchquery.model.query.QueryResponse;
import nu.marginalia.api.searchquery.model.query.SearchSpecification;
import nu.marginalia.api.searchquery.model.results.DecoratedSearchResultItem;
import nu.marginalia.api.searchquery.model.results.PrototypeRankingParameters;
import nu.marginalia.api.searchquery.model.results.SearchResultItem;
@@ -32,6 +29,8 @@ public class QueryProtobufCodec {
builder.setSearchSetIdentifier(query.specs.searchSetIdentifier);
builder.setHumanQuery(request.getHumanQuery());
builder.setNsfwFilterTierValue(request.getNsfwFilterTierValue());
builder.setQuality(IndexProtobufCodec.convertSpecLimit(query.specs.quality));
builder.setYear(IndexProtobufCodec.convertSpecLimit(query.specs.year));
builder.setSize(IndexProtobufCodec.convertSpecLimit(query.specs.size));
@@ -78,6 +77,8 @@ public class QueryProtobufCodec {
builder.setSearchSetIdentifier(query.specs.searchSetIdentifier);
builder.setHumanQuery(humanQuery);
builder.setNsfwFilterTier(RpcIndexQuery.NSFW_FILTER_TIER.DANGER);
builder.setQuality(IndexProtobufCodec.convertSpecLimit(query.specs.quality));
builder.setYear(IndexProtobufCodec.convertSpecLimit(query.specs.year));
builder.setSize(IndexProtobufCodec.convertSpecLimit(query.specs.size));
@@ -112,6 +113,7 @@ public class QueryProtobufCodec {
request.getSearchSetIdentifier(),
QueryStrategy.valueOf(request.getQueryStrategy()),
RpcTemporalBias.Bias.valueOf(request.getTemporalBias().getBias().name()),
NsfwFilterTier.fromCodedValue(request.getNsfwFilterTierValue()),
request.getPagination().getPage()
);
}
@@ -327,6 +329,7 @@ public class QueryProtobufCodec {
.setRank(IndexProtobufCodec.convertSpecLimit(params.rank()))
.setSearchSetIdentifier(params.identifier())
.setQueryStrategy(params.queryStrategy().name())
.setNsfwFilterTierValue(params.filterTier().getCodedValue())
.setTemporalBias(RpcTemporalBias.newBuilder()
.setBias(RpcTemporalBias.Bias.valueOf(params.temporalBias().name()))
.build())

View File

@@ -0,0 +1,26 @@
package nu.marginalia.api.searchquery.model.query;
public enum NsfwFilterTier {
OFF(0),
DANGER(1),
PORN_AND_GAMBLING(2);
private final int codedValue; // same as ordinal() for now, but can be changed later if needed
NsfwFilterTier(int codedValue) {
this.codedValue = codedValue;
}
public static NsfwFilterTier fromCodedValue(int codedValue) {
for (NsfwFilterTier tier : NsfwFilterTier.values()) {
if (tier.codedValue == codedValue) {
return tier;
}
}
throw new IllegalArgumentException("Invalid coded value for NsfwFilterTier: " + codedValue);
}
public int getCodedValue() {
return codedValue;
}
}
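
A small usage sketch of the coded-value round trip the protobuf codec relies on:

import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;

class FilterTierSketch {
    public static void main(String[] args) {
        NsfwFilterTier tier = NsfwFilterTier.fromCodedValue(2);
        System.out.println(tier);                 // PORN_AND_GAMBLING
        System.out.println(tier.getCodedValue()); // 2
        // Unknown coded values fail fast rather than silently defaulting to OFF
        try {
            NsfwFilterTier.fromCodedValue(9);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}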

View File

@@ -25,10 +25,11 @@ public record QueryParams(
String identifier,
QueryStrategy queryStrategy,
RpcTemporalBias.Bias temporalBias,
NsfwFilterTier filterTier,
int page
)
{
public QueryParams(String query, RpcQueryLimits limits, String identifier, NsfwFilterTier filterTier) {
this(query, null,
List.of(),
List.of(),
@@ -43,6 +44,7 @@ public record QueryParams(
identifier,
QueryStrategy.AUTO,
RpcTemporalBias.Bias.NONE,
filterTier,
1 // page
);
}

View File

@@ -32,6 +32,14 @@ message RpcQsQuery {
RpcTemporalBias temporalBias = 16;
RpcQsQueryPagination pagination = 17;
NSFW_FILTER_TIER nsfwFilterTier = 18;
enum NSFW_FILTER_TIER {
NONE = 0;
DANGER = 1;
PORN_AND_GAMBLING = 2;
};
}
/* Query service query response */
@@ -78,8 +86,17 @@ message RpcIndexQuery {
RpcQueryLimits queryLimits = 10;
string queryStrategy = 11; // Named query configuration
RpcResultRankingParameters parameters = 12;
NSFW_FILTER_TIER nsfwFilterTier = 13;
enum NSFW_FILTER_TIER {
NONE = 0;
DANGER = 1;
PORN_AND_GAMBLING = 2;
};
}
/* A tagged union encoding some limit on a field */
message RpcSpecLimit {
int32 value = 1;
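
For reference, a sketch of how the new field travels through the generated Java builders, mirroring QueryProtobufCodec above; the generated class RpcQsQuery and its package are assumptions inferred from the codec's imports:

import nu.marginalia.api.searchquery.RpcQsQuery;
import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;

class NsfwFilterTierProtoSketch {
    public static void main(String[] args) {
        RpcQsQuery query = RpcQsQuery.newBuilder()
                .setHumanQuery("hello world")
                .setNsfwFilterTier(RpcQsQuery.NSFW_FILTER_TIER.DANGER)
                .build();
        // The coded value (1 for DANGER) maps back onto the domain-model enum
        NsfwFilterTier tier = NsfwFilterTier.fromCodedValue(query.getNsfwFilterTierValue());
        System.out.println(tier); // DANGER
    }
}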

View File

@@ -19,6 +19,7 @@ dependencies {
implementation project(':code:common:model')
implementation project(':code:common:service')
implementation project(':code:functions:nsfw-domain-filter')
implementation project(':code:functions:search-query:api')
implementation project(':code:index:query')

View File

@@ -11,6 +11,7 @@ import nu.marginalia.api.searchquery.model.query.QueryParams;
import nu.marginalia.api.searchquery.model.results.DecoratedSearchResultItem;
import nu.marginalia.api.searchquery.model.results.PrototypeRankingParameters;
import nu.marginalia.index.api.IndexClient;
import nu.marginalia.nsfw.NsfwDomainFilter;
import nu.marginalia.service.server.DiscoverableService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -34,13 +35,16 @@ public class QueryGRPCService
private final QueryFactory queryFactory;
private final NsfwDomainFilter nsfwDomainFilter;
private final IndexClient indexClient;
@Inject
public QueryGRPCService(QueryFactory queryFactory,
NsfwDomainFilter nsfwDomainFilter,
IndexClient indexClient)
{
this.queryFactory = queryFactory;
this.nsfwDomainFilter = nsfwDomainFilter;
this.indexClient = indexClient;
}

View File

@@ -3,6 +3,7 @@ package nu.marginalia.query.svc;
import nu.marginalia.WmsaHome;
import nu.marginalia.api.searchquery.RpcQueryLimits;
import nu.marginalia.api.searchquery.RpcTemporalBias;
import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
import nu.marginalia.api.searchquery.model.query.QueryParams;
import nu.marginalia.api.searchquery.model.query.SearchSpecification;
import nu.marginalia.functions.searchquery.QueryFactory;
@@ -58,6 +59,7 @@ public class QueryFactoryTest {
"NONE", "NONE",
QueryStrategy.AUTO, QueryStrategy.AUTO,
RpcTemporalBias.Bias.NONE, RpcTemporalBias.Bias.NONE,
NsfwFilterTier.OFF,
0), null).specs;
}

View File

@@ -17,6 +17,7 @@ dependencies {
implementation project(':code:common:service')
implementation project(':code:common:db')
implementation project(':code:libraries:message-queue')
implementation project(':code:functions:nsfw-domain-filter')
implementation project(':code:functions:search-query:api')
implementation libs.bundles.slf4j

View File

@@ -2,11 +2,13 @@ package nu.marginalia.index.api;
import com.google.inject.Inject;
import com.google.inject.Singleton;
import io.prometheus.client.Counter;
import nu.marginalia.api.searchquery.IndexApiGrpc;
import nu.marginalia.api.searchquery.RpcDecoratedResultItem;
import nu.marginalia.api.searchquery.RpcIndexQuery;
import nu.marginalia.db.DomainBlacklistImpl;
import nu.marginalia.model.id.UrlIdCodec;
import nu.marginalia.nsfw.NsfwDomainFilter;
import nu.marginalia.service.client.GrpcChannelPoolFactory;
import nu.marginalia.service.client.GrpcMultiNodeChannelPool;
import nu.marginalia.service.discovery.property.ServiceKey;
@@ -28,14 +30,26 @@ public class IndexClient {
private static final Logger logger = LoggerFactory.getLogger(IndexClient.class);
private final GrpcMultiNodeChannelPool<IndexApiGrpc.IndexApiBlockingStub> channelPool;
private final DomainBlacklistImpl blacklist;
private final NsfwDomainFilter nsfwDomainFilter;
Counter wmsa_index_query_count = Counter.build()
.name("wmsa_nsfw_filter_result_count")
.labelNames("tier")
.help("Count of results filtered by NSFW tier")
.register();
private static final ExecutorService executor = Executors.newCachedThreadPool();
@Inject
public IndexClient(GrpcChannelPoolFactory channelPoolFactory,
DomainBlacklistImpl blacklist,
NsfwDomainFilter nsfwDomainFilter
) {
this.channelPool = channelPoolFactory.createMulti(
ServiceKey.forGrpcApi(IndexApiGrpc.class, ServicePartition.multi()),
IndexApiGrpc::newBlockingStub);
this.blacklist = blacklist;
this.nsfwDomainFilter = nsfwDomainFilter;
}
private static final Comparator<RpcDecoratedResultItem> comparator =
@@ -52,7 +66,7 @@ public class IndexClient {
public AggregateQueryResponse executeQueries(RpcIndexQuery indexRequest, Pagination pagination) {
final int requestedMaxResults = indexRequest.getQueryLimits().getResultsTotal();
int filterTier = indexRequest.getNsfwFilterTierValue();
AtomicInteger totalNumResults = new AtomicInteger(0);
List<RpcDecoratedResultItem> results =
@@ -74,7 +88,7 @@ public class IndexClient {
}
})
.flatMap(List::stream)
.filter(item -> !isBlacklisted(item, filterTier))
.sorted(comparator)
.skip(Math.max(0, (pagination.page - 1) * pagination.pageSize))
.limit(pagination.pageSize)
@@ -83,8 +97,23 @@ public class IndexClient {
return new AggregateQueryResponse(results, pagination.page(), totalNumResults.get());
}
static String[] tierNames = {
"OFF",
"DANGER",
"NSFW"
};
private boolean isBlacklisted(RpcDecoratedResultItem item, int filterTier) {
int domainId = UrlIdCodec.getDomainId(item.getRawItem().getCombinedId());
if (blacklist.isBlacklisted(domainId)) {
return true;
}
if (nsfwDomainFilter.isBlocked(domainId, filterTier)) {
wmsa_index_query_count.labels(tierNames[filterTier]).inc();
return true;
}
return false;
} }
} }
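
For reference, the NsfwFilterTier enum that the rest of this changeset passes around is not itself shown in the diff; a plausible sketch, assuming the coded values line up with the tierNames array above (treat the names and values as assumptions):

// Hedged sketch -- not necessarily the actual enum definition
public enum NsfwFilterTier {
    OFF(0),      // no filtering
    DANGER(1),   // filter domains on the UT1 danger-style lists
    NSFW(2);     // strictest tier, also filters adult content

    private final int codedValue;

    NsfwFilterTier(int codedValue) {
        this.codedValue = codedValue;
    }

    // Maps e.g. the ?nsfw= API parameter back to a tier; unknown values
    // throw IllegalArgumentException, which the API layer turns into HTTP 400
    public static NsfwFilterTier fromCodedValue(int codedValue) {
        for (NsfwFilterTier tier : values()) {
            if (tier.codedValue == codedValue)
                return tier;
        }
        throw new IllegalArgumentException("Invalid NsfwFilterTier coded value: " + codedValue);
    }

    public int getCodedValue() {
        return codedValue;
    }
}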

View File

@@ -84,7 +84,7 @@ public class ForwardIndexConverter {
         LongArray docFileData = LongArrayFactory.mmapForWritingConfined(outputFileDocsData, ForwardIndexParameters.ENTRY_SIZE * docsFileId.size());
-        ByteBuffer workArea = ByteBuffer.allocate(65536);
+        ByteBuffer workArea = ByteBuffer.allocate(1024*1024*100);
         for (var instance : journal.pages()) {
             try (var slopTable = new SlopTable(instance.baseDir(), instance.page()))
             {

View File

@@ -53,6 +53,7 @@ public class SideloaderProcessing {
                 "",
                 body.getBytes(StandardCharsets.UTF_8),
                 false,
+                -1,
                 null,
                 null
         );

View File

@@ -2002,12 +2002,11 @@ public class HeadingAwarePDFTextStripper extends LegacyPDFStreamEngine
         float minFontWeight = Integer.MAX_VALUE;
         for (var word : line)
         {
-            int i = 0;
             for (var textPosition : word.getTextPositions())
             {
-                if (word.text.charAt(i++) == ' ') {
-                    continue;
-                }
+                // Skip empty text positions as they may have a different font
+                if (word.text.isBlank()) continue;
                 var font = textPosition.getFont();
                 if (font == null) continue;
                 var descriptor = font.getFontDescriptor();

View File

@@ -148,6 +148,7 @@ public class ConvertingIntegrationTest {
                 "",
                 readClassPathFile(p.toString()).getBytes(),
                 false,
+                -1,
                 null,
                 null
         );

View File

@@ -50,7 +50,7 @@ class PdfDocumentProcessorPluginTest {
         ));
     }
     public AbstractDocumentProcessorPlugin.DetailsWithWords testPdfFile(byte[] pdfBytes) throws Exception {
-        var doc = new CrawledDocument("test", "https://www.example.com/sample.pdf", "application/pdf", Instant.now().toString(), 200, "OK", "OK", "", pdfBytes, false, null, null);
+        var doc = new CrawledDocument("test", "https://www.example.com/sample.pdf", "application/pdf", Instant.now().toString(), 200, "OK", "OK", "", pdfBytes, false, -1, null, null);
         return plugin.createDetails(doc, new LinkTexts(), DocumentClass.NORMAL);
     }

View File

@@ -10,6 +10,7 @@ import java.net.http.HttpClient;
 import java.net.http.HttpHeaders;
 import java.net.http.HttpResponse;
 import java.nio.charset.StandardCharsets;
+import java.time.Duration;
 import java.util.*;
 import java.util.stream.Collectors;
@@ -90,8 +91,8 @@ public class WarcProtocolReconstructor {
         return "HTTP/" + version + " " + statusCode + " " + statusMessage + "\r\n" + headerString + "\r\n\r\n";
     }

-    static String getResponseHeader(ClassicHttpResponse response, long size) {
-        String headerString = getHeadersAsString(response.getHeaders(), size);
+    static String getResponseHeader(ClassicHttpResponse response, Duration responseDuration, long size) {
+        String headerString = getHeadersAsString(response.getHeaders(), responseDuration, size);

         return response.getVersion().format() + " " + response.getCode() + " " + response.getReasonPhrase() + "\r\n" + headerString + "\r\n\r\n";
     }
@@ -160,7 +161,7 @@ public class WarcProtocolReconstructor {
-    static private String getHeadersAsString(Header[] headers, long responseSize) {
+    static private String getHeadersAsString(Header[] headers, Duration responseDuration, long responseSize) {
         StringJoiner joiner = new StringJoiner("\r\n");

         for (var header : headers) {
@@ -176,6 +177,7 @@ public class WarcProtocolReconstructor {
             if (headerCapitalized.equals("Content-Encoding"))
                 continue;

             // Since we're transparently decoding gzip, we need to update the Content-Length header
             // to reflect the actual size of the response body. We'll do this at the end.
             if (headerCapitalized.equals("Content-Length"))
@@ -184,6 +186,7 @@ public class WarcProtocolReconstructor {
             joiner.add(headerCapitalized + ": " + header.getValue());
         }

+        joiner.add("X-Marginalia-Response-Time: " + responseDuration.toMillis());
         joiner.add("Content-Length: " + responseSize);

         return joiner.toString();
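
The response duration is smuggled through the reconstructed HTTP header block as a synthetic X-Marginalia-Response-Time header: written here at crawl time, then parsed back out (and stripped) when WARC data is converted to Slop records later in this changeset. A minimal sketch of the round trip, with illustrative helper names (only the header name and the -1 sentinel come from the diff):

import java.time.Duration;
import java.time.Instant;

class ResponseTimeRoundTrip {
    static final String HEADER = "X-Marginalia-Response-Time";

    // Crawl side: serialize the measured duration in milliseconds
    static String writeHeader(Instant requestDate, Instant responseDate) {
        return HEADER + ": " + Duration.between(requestDate, responseDate).toMillis();
    }

    // Conversion side: parse it back, falling back to the -1 "unknown" sentinel
    static int parseHeader(String name, String value) {
        if (!name.equals(HEADER))
            return -1;
        try {
            return Integer.parseInt(value);
        }
        catch (NumberFormatException ex) {
            return -1;
        }
    }
}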

View File

@@ -93,7 +93,7 @@ public class WarcRecorder implements AutoCloseable {
         WarcDigestBuilder responseDigestBuilder = new WarcDigestBuilder();
         WarcDigestBuilder payloadDigestBuilder = new WarcDigestBuilder();

-        Instant date = Instant.now();
+        Instant requestDate = Instant.now();

         // Not entirely sure why we need to do this, but keeping it due to Chesterton's Fence
         Map<String, List<String>> extraHeaders = new HashMap<>(request.getHeaders().length);
@@ -108,6 +108,8 @@ public class WarcRecorder implements AutoCloseable {
         try (WarcInputBuffer inputBuffer = WarcInputBuffer.forResponse(response, request, timeout);
              InputStream inputStream = inputBuffer.read()) {

+            Instant responseDate = Instant.now();
+
             cookies.updateCookieStore(response);

             // Build and write the request
@@ -126,7 +128,7 @@ public class WarcRecorder implements AutoCloseable {
             WarcRequest warcRequest = new WarcRequest.Builder(requestUri)
                     .blockDigest(requestDigestBuilder.build())
-                    .date(date)
+                    .date(requestDate)
                     .body(MediaType.HTTP_REQUEST, httpRequestString)
                     .build();
@@ -138,7 +140,9 @@ public class WarcRecorder implements AutoCloseable {
                 response.addHeader("X-Has-Cookies", 1);
             }

-            byte[] responseHeaders = WarcProtocolReconstructor.getResponseHeader(response, inputBuffer.size()).getBytes(StandardCharsets.UTF_8);
+            byte[] responseHeaders = WarcProtocolReconstructor.getResponseHeader(response,
+                    Duration.between(requestDate, responseDate),
+                    inputBuffer.size()).getBytes(StandardCharsets.UTF_8);

             ResponseDataBuffer responseDataBuffer = new ResponseDataBuffer(inputBuffer.size() + responseHeaders.length);
@@ -169,7 +173,7 @@ public class WarcRecorder implements AutoCloseable {
             WarcResponse.Builder responseBuilder = new WarcResponse.Builder(responseUri)
                     .blockDigest(responseDigestBuilder.build())
-                    .date(date)
+                    .date(responseDate)
                     .concurrentTo(warcRequest.id())
                     .body(MediaType.HTTP_RESPONSE, responseDataBuffer.copyBytes());
@@ -184,7 +188,7 @@ public class WarcRecorder implements AutoCloseable {
             warcResponse.http(); // force HTTP header to be parsed before body is consumed so that caller can use it
             writer.write(warcResponse);

-            if (Duration.between(date, Instant.now()).compareTo(Duration.ofSeconds(9)) > 0
+            if (Duration.between(requestDate, Instant.now()).compareTo(Duration.ofSeconds(9)) > 0
                     && inputBuffer.size() < 2048
                     && !requestUri.getPath().endsWith("robots.txt")) // don't bail on robots.txt
             {
@@ -196,7 +200,7 @@ public class WarcRecorder implements AutoCloseable {
                 logger.warn("URL {} took too long to fetch ({}s) and was too small for the effort ({}b)",
                         requestUri,
-                        Duration.between(date, Instant.now()).getSeconds(),
+                        Duration.between(requestDate, Instant.now()).getSeconds(),
                         inputBuffer.size()
                 );

View File

@@ -148,6 +148,7 @@ public class ParquetSerializableCrawlDataStream implements AutoCloseable, Serial
                         nextRecord.body,
                         // this field isn't actually used, maybe we can skip calculating it?
                         nextRecord.cookies,
+                        -1,
                         lastModified,
                         etag));
             }

View File

@@ -166,6 +166,7 @@ public class SlopSerializableCrawlDataStream implements AutoCloseable, Serializa
                         nextRecord.body(),
                         // this field isn't actually used, maybe we can skip calculating it?
                         nextRecord.cookies(),
+                        nextRecord.requestTimeMs(),
                         null,
                         null));
             }

View File

@@ -23,6 +23,7 @@ public final class CrawledDocument implements SerializableCrawlData {
     public String crawlerStatus;
     public String crawlerStatusDesc;
+    public int requestTimeMs;

     @Nullable
     public String headers;
@@ -82,7 +83,7 @@ public final class CrawledDocument implements SerializableCrawlData {
     public String lastModifiedMaybe;
     public String etagMaybe;

-    public CrawledDocument(String crawlId, String url, String contentType, String timestamp, int httpStatus, String crawlerStatus, String crawlerStatusDesc, @Nullable String headers, byte[] documentBodyBytes, Boolean hasCookies, String lastModifiedMaybe, String etagMaybe) {
+    public CrawledDocument(String crawlId, String url, String contentType, String timestamp, int httpStatus, String crawlerStatus, String crawlerStatusDesc, @Nullable String headers, byte[] documentBodyBytes, Boolean hasCookies, int requestTimeMs, String lastModifiedMaybe, String etagMaybe) {
         this.crawlId = crawlId;
         this.url = url;
         this.contentType = contentType;
@@ -94,6 +95,7 @@ public final class CrawledDocument implements SerializableCrawlData {
         this.documentBodyBytes = Objects.requireNonNullElse(documentBodyBytes, new byte[] {});
         this.hasCookies = hasCookies;
         this.lastModifiedMaybe = lastModifiedMaybe;
+        this.requestTimeMs = requestTimeMs;
         this.etagMaybe = etagMaybe;
     }
@@ -173,6 +175,7 @@ public final class CrawledDocument implements SerializableCrawlData {
         private byte[] documentBodyBytes = new byte[0];
         private String recrawlState;
         private Boolean hasCookies;
+        private int requestTimeMs;
         private String lastModifiedMaybe;
         private String etagMaybe;
@@ -248,8 +251,13 @@ public final class CrawledDocument implements SerializableCrawlData {
             return this;
         }

+        public CrawledDocumentBuilder requestTimeMs(int requestTimeMs) {
+            this.requestTimeMs = requestTimeMs;
+            return this;
+        }
+
         public CrawledDocument build() {
-            return new CrawledDocument(this.crawlId, this.url, this.contentType, this.timestamp, this.httpStatus, this.crawlerStatus, this.crawlerStatusDesc, this.headers, this.documentBodyBytes, this.hasCookies, this.lastModifiedMaybe, this.etagMaybe);
+            return new CrawledDocument(this.crawlId, this.url, this.contentType, this.timestamp, this.httpStatus, this.crawlerStatus, this.crawlerStatusDesc, this.headers, this.documentBodyBytes, this.hasCookies, this.requestTimeMs, this.lastModifiedMaybe, this.etagMaybe);
         }

         public String toString() {
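
The builder gains a matching setter, so crawl-side code can record the timing fluently. A usage sketch (the builder accessor and the other setter names are inferred from the fields above, so treat them as assumptions):

// Hedged sketch: constructing a CrawledDocument with the new timing field
var doc = CrawledDocument.builder()
        .crawlId("test")
        .url("https://www.example.com/")
        .contentType("text/html")
        .httpStatus(200)
        .hasCookies(false)
        .requestTimeMs(137)   // new in this changeset; -1 means "not measured"
        .build();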

View File

@@ -9,6 +9,7 @@ import nu.marginalia.parquet.crawldata.CrawledDocumentParquetRecord;
 import nu.marginalia.parquet.crawldata.CrawledDocumentParquetRecordFileReader;
 import nu.marginalia.slop.column.array.ByteArrayColumn;
 import nu.marginalia.slop.column.primitive.ByteColumn;
+import nu.marginalia.slop.column.primitive.IntColumn;
 import nu.marginalia.slop.column.primitive.LongColumn;
 import nu.marginalia.slop.column.primitive.ShortColumn;
 import nu.marginalia.slop.column.string.EnumColumn;
@@ -39,6 +40,7 @@ public record SlopCrawlDataRecord(String domain,
                                   long timestamp,
                                   String contentType,
                                   byte[] body,
+                                  int requestTimeMs,
                                   String headers)
 {
     private static final EnumColumn domainColumn = new EnumColumn("domain", StandardCharsets.UTF_8, StorageType.ZSTD);
@@ -49,6 +51,7 @@ public record SlopCrawlDataRecord(String domain,
     private static final LongColumn timestampColumn = new LongColumn("timestamp");
     private static final EnumColumn contentTypeColumn = new EnumColumn("contentType", StandardCharsets.UTF_8);
     private static final ByteArrayColumn bodyColumn = new ByteArrayColumn("body", StorageType.ZSTD);
+    private static final ShortColumn requestTimeColumn = new ShortColumn("requestTimeMs");
     private static final StringColumn headerColumn = new StringColumn("header", StandardCharsets.UTF_8, StorageType.ZSTD);

     public SlopCrawlDataRecord(CrawledDocumentParquetRecord parquetRecord) {
@@ -60,6 +63,7 @@ public record SlopCrawlDataRecord(String domain,
                 parquetRecord.timestamp.toEpochMilli(),
                 parquetRecord.contentType,
                 parquetRecord.body,
+                -1,
                 parquetRecord.headers
         );
     }
@@ -74,6 +78,7 @@ public record SlopCrawlDataRecord(String domain,
                 date.toEpochMilli(),
                 "x-marginalia/advisory;state=redirect",
                 new byte[0],
+                -1,
                 ""
         );
     }
@@ -87,6 +92,7 @@ public record SlopCrawlDataRecord(String domain,
                 date.toEpochMilli(),
                 "x-marginalia/advisory;state=error",
                 errorStatus.getBytes(),
+                -1,
                 ""
         );
     }
@@ -100,6 +106,7 @@ public record SlopCrawlDataRecord(String domain,
                 date.toEpochMilli(),
                 errorStatus,
                 new byte[0],
+                -1,
                 ""
         );
     }
@@ -321,6 +328,7 @@ public record SlopCrawlDataRecord(String domain,
         private final LongColumn.Writer timestampColumnWriter;
         private final EnumColumn.Writer contentTypeColumnWriter;
         private final ByteArrayColumn.Writer bodyColumnWriter;
+        private final ShortColumn.Writer requestTimeColumnWriter;
         private final StringColumn.Writer headerColumnWriter;

         public Writer(Path path) throws IOException {
@@ -334,6 +342,7 @@ public record SlopCrawlDataRecord(String domain,
             timestampColumnWriter = timestampColumn.create(this);
             contentTypeColumnWriter = contentTypeColumn.create(this);
             bodyColumnWriter = bodyColumn.create(this);
+            requestTimeColumnWriter = requestTimeColumn.create(this);
             headerColumnWriter = headerColumn.create(this);
         }
@@ -346,6 +355,7 @@ public record SlopCrawlDataRecord(String domain,
             timestampColumnWriter.put(record.timestamp);
             contentTypeColumnWriter.put(record.contentType);
             bodyColumnWriter.put(record.body);
+            requestTimeColumnWriter.put((short) record.requestTimeMs);
             headerColumnWriter.put(record.headers);
         }
@@ -391,10 +401,20 @@ public record SlopCrawlDataRecord(String domain,
             String headersStr;
             StringJoiner headersStrBuilder = new StringJoiner("\n");
+            int requestTimeMs = -1;
             for (var header : headers) {
                 if (header.getName().equalsIgnoreCase("X-Cookies") && "1".equals(header.getValue())) {
                     hasCookies = true;
                 }
+                if (header.getName().equals("X-Marginalia-Response-Time")) {
+                    try {
+                        requestTimeMs = Integer.parseInt(header.getValue());
+                    }
+                    catch (NumberFormatException ex) {
+                        logger.warn("Failed to parse X-Marginalia-Response-Time header: {}", header.getValue());
+                    }
+                    continue;
+                }
                 headersStrBuilder.add(header.getName() + ": " + header.getValue());
             }
             headersStr = headersStrBuilder.toString();
@@ -409,6 +429,7 @@ public record SlopCrawlDataRecord(String domain,
                             response.date().toEpochMilli(),
                             contentType,
                             bodyBytes,
+                            requestTimeMs,
                             headersStr
                     )
             );
@@ -461,6 +482,7 @@ public record SlopCrawlDataRecord(String domain,
         private final LongColumn.Reader timestampColumnReader;
         private final EnumColumn.Reader contentTypeColumnReader;
         private final ByteArrayColumn.Reader bodyColumnReader;
+        private final ShortColumn.Reader requestTimeColumnReader;
         private final StringColumn.Reader headerColumnReader;

         public Reader(Path path) throws IOException {
@@ -475,6 +497,17 @@ public record SlopCrawlDataRecord(String domain,
             contentTypeColumnReader = contentTypeColumn.open(this);
             bodyColumnReader = bodyColumn.open(this);
             headerColumnReader = headerColumn.open(this);
+
+            // FIXME: After 2025-06-XX, we can remove this migration workaround
+            ShortColumn.Reader timeColumnReader;
+            try {
+                timeColumnReader = requestTimeColumn.open(this);
+            }
+            catch (Exception ex) {
+                // Migration workaround
+                timeColumnReader = null;
+            }
+            requestTimeColumnReader = timeColumnReader;
         }

         public SlopCrawlDataRecord get() throws IOException {
@@ -487,6 +520,7 @@ public record SlopCrawlDataRecord(String domain,
                     timestampColumnReader.get(),
                     contentTypeColumnReader.get(),
                     bodyColumnReader.get(),
+                    requestTimeColumnReader != null ? requestTimeColumnReader.get() : -1,
                     headerColumnReader.get()
             );
         }
@@ -506,6 +540,7 @@ public record SlopCrawlDataRecord(String domain,
         private final LongColumn.Reader timestampColumnReader;
         private final EnumColumn.Reader contentTypeColumnReader;
         private final ByteArrayColumn.Reader bodyColumnReader;
+        private final ShortColumn.Reader requestTimeColumnReader;
         private final StringColumn.Reader headerColumnReader;

         private SlopCrawlDataRecord next = null;
@@ -522,6 +557,17 @@ public record SlopCrawlDataRecord(String domain,
             contentTypeColumnReader = contentTypeColumn.open(this);
             bodyColumnReader = bodyColumn.open(this);
             headerColumnReader = headerColumn.open(this);
+
+            // FIXME: After 2025-06-XX, we can remove this migration workaround
+            ShortColumn.Reader timeColumnReader;
+            try {
+                timeColumnReader = requestTimeColumn.open(this);
+            }
+            catch (Exception ex) {
+                // Migration workaround
+                timeColumnReader = null;
+            }
+            requestTimeColumnReader = timeColumnReader;
         }

         public abstract boolean filter(String url, int status, String contentType);
@@ -548,6 +594,7 @@ public record SlopCrawlDataRecord(String domain,
             boolean cookies = cookiesColumnReader.get() == 1;
             int status = statusColumnReader.get();
             long timestamp = timestampColumnReader.get();
+            int requestTimeMs = requestTimeColumnReader != null ? requestTimeColumnReader.get() : -1;
             String contentType = contentTypeColumnReader.get();

             LargeItem<byte[]> body = bodyColumnReader.getLarge();
@@ -555,7 +602,7 @@ public record SlopCrawlDataRecord(String domain,
             if (filter(url, status, contentType)) {
                 next = new SlopCrawlDataRecord(
-                        domain, url, ip, cookies, status, timestamp, contentType, body.get(), headers.get()
+                        domain, url, ip, cookies, status, timestamp, contentType, body.get(), requestTimeMs, headers.get()
                 );
                 return true;
             }
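
The readers open the new column defensively so Slop files written before this change remain readable, degrading to the -1 sentinel. The same try/catch-open pattern appears in both readers above; a generalized sketch (openOrNull is a hypothetical helper, not part of the Slop API):

static ShortColumn.Reader openOrNull(ShortColumn column, SlopTable table) {
    try {
        return column.open(table);
    }
    catch (Exception ex) {
        // Column absent: the file predates the requestTimeMs migration
        return null;
    }
}

// At read time, a missing column degrades to the sentinel:
// int requestTimeMs = reader != null ? reader.get() : -1;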

View File

@@ -195,6 +195,7 @@ public class LiveCrawlDataSet implements AutoCloseable {
                         headers,
                         body,
                         false,
+                        -1,
                         "",
                         ""
                 ));

View File

@@ -7,6 +7,7 @@ import nu.marginalia.api.model.ApiSearchResultQueryDetails;
 import nu.marginalia.api.model.ApiSearchResults;
 import nu.marginalia.api.searchquery.QueryClient;
 import nu.marginalia.api.searchquery.RpcQueryLimits;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.api.searchquery.model.query.QueryParams;
 import nu.marginalia.api.searchquery.model.query.SearchSetIdentifier;
 import nu.marginalia.api.searchquery.model.results.DecoratedSearchResultItem;
@@ -29,9 +30,10 @@ public class ApiSearchOperator {

     public ApiSearchResults query(String query,
                                   int count,
-                                  int index)
+                                  int index,
+                                  NsfwFilterTier filterTier)
     {
-        var rsp = queryClient.search(createParams(query, count, index));
+        var rsp = queryClient.search(createParams(query, count, index, filterTier));

         return new ApiSearchResults("RESTRICTED", query,
                 rsp.results()
@@ -42,7 +44,7 @@ public class ApiSearchOperator {
                         .collect(Collectors.toList()));
     }

-    private QueryParams createParams(String query, int count, int index) {
+    private QueryParams createParams(String query, int count, int index, NsfwFilterTier filterTirer) {
         SearchSetIdentifier searchSet = selectSearchSet(index);

         return new QueryParams(
@@ -53,7 +55,8 @@ public class ApiSearchOperator {
                         .setTimeoutMs(150)
                         .setFetchSize(8192)
                         .build(),
-                searchSet.name());
+                searchSet.name(),
+                filterTirer);
     }

     private SearchSetIdentifier selectSearchSet(int index) {

View File

@@ -6,6 +6,7 @@ import io.prometheus.client.Counter;
 import io.prometheus.client.Histogram;
 import nu.marginalia.api.model.ApiLicense;
 import nu.marginalia.api.model.ApiSearchResults;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.api.svc.LicenseService;
 import nu.marginalia.api.svc.RateLimiterService;
 import nu.marginalia.api.svc.ResponseCache;
@@ -119,6 +120,16 @@ public class ApiService extends SparkService {
         int count = intParam(request, "count", 20);
         int index = intParam(request, "index", 3);
+        int nsfw = intParam(request, "nsfw", 1);
+
+        NsfwFilterTier nsfwFilterTier;
+        try {
+            nsfwFilterTier = NsfwFilterTier.fromCodedValue(nsfw);
+        }
+        catch (IllegalArgumentException e) {
+            Spark.halt(400, "Invalid nsfw parameter value");
+            return null; // Unreachable, but required to satisfy the compiler
+        }

         logger.info(queryMarker, "{} Search {}", license.key, query);
@@ -126,7 +137,7 @@ public class ApiService extends SparkService {
                 .labels(license.key)
                 .time(() ->
                         searchOperator
-                                .query(query, count, index)
+                                .query(query, count, index, nsfwFilterTier)
                                 .withLicense(license.getLicense())
                 );
     }
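
The public API thus gains an optional nsfw query parameter, defaulting to 1. A hedged usage sketch (the host and path layout are illustrative; only the parameter names, the default, and the 400 behaviour come from the code above):

var client = java.net.http.HttpClient.newHttpClient();
var request = java.net.http.HttpRequest.newBuilder(
                java.net.URI.create("https://api.example.com/search/marginalia?count=20&nsfw=2"))
        .GET()
        .build();
// send() throws IOException/InterruptedException; handle or declare them
var response = client.send(request, java.net.http.HttpResponse.BodyHandlers.ofString());
// nsfw=2 requests the strictest tier; an unrecognized value yields HTTP 400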

View File

@@ -2,6 +2,7 @@ package nu.marginalia.search;

 import nu.marginalia.api.searchquery.RpcQueryLimits;
 import nu.marginalia.api.searchquery.RpcTemporalBias;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.api.searchquery.model.query.QueryParams;
 import nu.marginalia.api.searchquery.model.query.SearchQuery;
 import nu.marginalia.api.searchquery.model.query.SearchSetIdentifier;
@@ -52,6 +53,7 @@ public class SearchQueryParamFactory {
                 profile.searchSetIdentifier.name(),
                 userParams.strategy(),
                 userParams.temporalBias(),
+                userParams.filterTier(),
                 userParams.page()
         );
@@ -78,6 +80,7 @@ public class SearchQueryParamFactory {
                 SearchSetIdentifier.NONE.name(),
                 QueryStrategy.AUTO,
                 RpcTemporalBias.Bias.NONE,
+                NsfwFilterTier.OFF,
                 1
         );
     }
@@ -98,6 +101,7 @@ public class SearchQueryParamFactory {
                 SearchSetIdentifier.NONE.name(),
                 QueryStrategy.AUTO,
                 RpcTemporalBias.Bias.NONE,
+                NsfwFilterTier.DANGER,
                 1
         );
     }
@@ -118,6 +122,7 @@ public class SearchQueryParamFactory {
                 SearchSetIdentifier.NONE.name(),
                 QueryStrategy.AUTO,
                 RpcTemporalBias.Bias.NONE,
+                NsfwFilterTier.DANGER,
                 1
         );
     }

View File

@@ -2,6 +2,7 @@ package nu.marginalia.search.command;

 import nu.marginalia.WebsiteUrl;
 import nu.marginalia.api.searchquery.RpcTemporalBias;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.index.query.limit.QueryStrategy;
 import nu.marginalia.index.query.limit.SpecificationLimit;
 import nu.marginalia.search.model.SearchProfile;
@@ -23,6 +24,10 @@ public record SearchParameters(String query,
                                int page
 ) {
+    public NsfwFilterTier filterTier() {
+        return NsfwFilterTier.DANGER;
+    }
+
     public SearchParameters(String queryString, Request request) {
         this(
                 queryString,

View File

@@ -61,7 +61,7 @@ public class UrlDeduplicator {
     private boolean limitResultsPerDomain(DecoratedSearchResultItem details) {
         final var domain = details.getUrl().getDomain();
-        final String key = domain.getDomainKey();
+        final String key = domain.toString();

         return keyCount.adjustOrPutValue(key, 1, 1) <= resultsPerKey;
     }

View File

@@ -112,13 +112,6 @@ public class SearchOperator {
                 .selectStrategy(queryResponse)
                 .clusterResults(queryResults, 25);

-        if (queryParams.humanQuery().equals("slackware linux")) {
-            logger.info("Query response: {}", queryResponse.results().subList(0, 5));
-            logger.info("Query results: {}", queryResults.subList(0, 5));
-            logger.info("Clustered results: {}", clusteredResults.subList(0, 5));
-        }
-
         // Log the query and results

         logger.info(queryMarker, "Human terms: {}", Strings.join(queryResponse.searchTermsHuman(), ','));

View File

@@ -2,6 +2,7 @@ package nu.marginalia.search;

 import nu.marginalia.api.searchquery.RpcQueryLimits;
 import nu.marginalia.api.searchquery.RpcTemporalBias;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.api.searchquery.model.query.QueryParams;
 import nu.marginalia.api.searchquery.model.query.SearchQuery;
 import nu.marginalia.api.searchquery.model.query.SearchSetIdentifier;
@@ -53,6 +54,7 @@ public class SearchQueryParamFactory {
                 profile.searchSetIdentifier.name(),
                 userParams.strategy(),
                 userParams.temporalBias(),
+                userParams.filterTier(),
                 userParams.page()
         );
@@ -79,6 +81,7 @@ public class SearchQueryParamFactory {
                 SearchSetIdentifier.NONE.name(),
                 QueryStrategy.AUTO,
                 RpcTemporalBias.Bias.NONE,
+                NsfwFilterTier.OFF,
                 page
         );
     }
@@ -99,6 +102,7 @@ public class SearchQueryParamFactory {
                 SearchSetIdentifier.NONE.name(),
                 QueryStrategy.AUTO,
                 RpcTemporalBias.Bias.NONE,
+                NsfwFilterTier.DANGER,
                 page
         );
     }
@@ -119,6 +123,7 @@ public class SearchQueryParamFactory {
                 SearchSetIdentifier.NONE.name(),
                 QueryStrategy.AUTO,
                 RpcTemporalBias.Bias.NONE,
+                NsfwFilterTier.DANGER,
                 1
         );
     }

View File

@@ -18,6 +18,7 @@ import nu.marginalia.service.server.JoobyService;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

+import java.nio.charset.StandardCharsets;
 import java.util.List;
 import java.util.NoSuchElementException;
@@ -41,6 +42,8 @@ public class SearchService extends JoobyService {
             .help("Search service error count")
             .register();

+    private final String openSearchXML;
+
     @Inject
     public SearchService(BaseServiceParams params,
                          WebsiteUrl websiteUrl,
@@ -69,6 +72,13 @@ public class SearchService extends JoobyService {
         this.siteSubscriptionService = siteSubscriptionService;
         this.faviconClient = faviconClient;
         this.domainQueries = domainQueries;
+
+        try (var is = ClassLoader.getSystemResourceAsStream("static/opensearch.xml")) {
+            openSearchXML = new String(is.readAllBytes(), StandardCharsets.UTF_8);
+        }
+        catch (Exception e) {
+            throw new RuntimeException("Failed to load OpenSearch XML", e);
+        }
     }

     @Override
@@ -82,6 +92,11 @@ public class SearchService extends JoobyService {
         jooby.get("/site/https://*", this::handleSiteUrlRedirect);
         jooby.get("/site/http://*", this::handleSiteUrlRedirect);

+        jooby.get("/opensearch.xml", ctx -> {
+            ctx.setResponseType(MediaType.valueOf("application/opensearchdescription+xml"));
+            return openSearchXML;
+        });
+
         String emptySvg = "<svg xmlns=\"http://www.w3.org/2000/svg\"></svg>";
         jooby.get("/site/{domain}/favicon", ctx -> {
             String domain = ctx.path("domain").value();

View File

@@ -2,6 +2,7 @@ package nu.marginalia.search.command;

 import nu.marginalia.WebsiteUrl;
 import nu.marginalia.api.searchquery.RpcTemporalBias;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.index.query.limit.QueryStrategy;
 import nu.marginalia.index.query.limit.SpecificationLimit;
 import nu.marginalia.model.EdgeDomain;
@@ -24,6 +25,10 @@ public record SearchParameters(WebsiteUrl url,
                                int page
 ) {
+    public NsfwFilterTier filterTier() {
+        return NsfwFilterTier.DANGER;
+    }
+
     public static SearchParameters defaultsForQuery(WebsiteUrl url, String query, int page) {
         return new SearchParameters(
                 url,

View File

@@ -25,28 +25,14 @@ public class UrlDeduplicator {
     }

     public boolean shouldRemove(DecoratedSearchResultItem details) {
-        if (details.url.domain.topDomain.equals("slackware.com")) {
-            if (!deduplicateOnSuperficialHash(details)) {
-                logger.info("Rejecting on superficial hash " + details.url);
-                return true;
-            }
-            if (!deduplicateOnLSH(details)) {
-                logger.info("Rejecting on LSH for " + details.url);
-                return true;
-            }
-            if (!limitResultsPerDomain(details)) {
-                logger.info("Rejecting on limitResultsPerDomain for " + details.url);
-                return true;
-            }
-        }
-        else {
             if (!deduplicateOnSuperficialHash(details))
                 return true;
             if (!deduplicateOnLSH(details))
                 return true;
             if (!limitResultsPerDomain(details))
                 return true;
-        }

         return false;
     }
@@ -76,7 +62,7 @@ public class UrlDeduplicator {
     private boolean limitResultsPerDomain(DecoratedSearchResultItem details) {
         final var domain = details.getUrl().getDomain();
-        final String key = domain.getDomainKey();
+        final String key = domain.toString();

         return keyCount.adjustOrPutValue(key, 1, 1) <= resultsPerKey;
     }
} }

View File

@@ -5,6 +5,7 @@ import com.google.inject.Inject;
 import io.jooby.Context;
 import io.jooby.Jooby;
 import nu.marginalia.assistant.suggest.Suggestions;
+import nu.marginalia.domsample.DomSampleService;
 import nu.marginalia.functions.domains.DomainInfoGrpcService;
 import nu.marginalia.functions.math.MathGrpcService;
 import nu.marginalia.livecapture.LiveCaptureGrpcService;
@@ -30,6 +31,7 @@ public class AssistantService extends JoobyService {
                             ScreenshotService screenshotService,
                             DomainInfoGrpcService domainInfoGrpcService,
                             LiveCaptureGrpcService liveCaptureGrpcService,
+                            DomSampleService domSampleService,
                             FeedsGrpcService feedsGrpcService,
                             MathGrpcService mathGrpcService,
                             Suggestions suggestions)
@@ -41,10 +43,11 @@ public class AssistantService extends JoobyService {
                         liveCaptureGrpcService,
                         feedsGrpcService),
                 List.of());
         this.screenshotService = screenshotService;
         this.suggestions = suggestions;
+
+        domSampleService.start();
     }

     public void startJooby(Jooby jooby) {
public void startJooby(Jooby jooby) { public void startJooby(Jooby jooby) {

View File

@@ -3,6 +3,7 @@ package nu.marginalia.control.app.svc;

 import com.google.inject.Inject;
 import nu.marginalia.api.searchquery.QueryClient;
 import nu.marginalia.api.searchquery.RpcQueryLimits;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.api.searchquery.model.query.QueryParams;
 import nu.marginalia.control.ControlRendererFactory;
 import nu.marginalia.model.EdgeUrl;
@@ -81,7 +82,8 @@ public class SearchToBanService {
                         .setFetchSize(8192)
                         .build()
                 ,
-                "NONE"
+                "NONE",
+                NsfwFilterTier.OFF
         ));
     }
 }
} }

View File

@@ -44,6 +44,7 @@ dependencies {
     implementation project(':code:functions:link-graph:api')
     implementation project(':code:functions:favicon')
     implementation project(':code:functions:favicon:api')
+    implementation project(':code:functions:nsfw-domain-filter')
     implementation project(':code:processes:crawling-process:model')
     implementation project(':code:processes:crawling-process:model')

View File

@@ -3,13 +3,14 @@ package nu.marginalia.executor;

 import com.google.inject.Guice;
 import com.google.inject.Inject;
 import com.google.inject.Injector;
+import nu.marginalia.nsfw.NsfwFilterModule;
 import nu.marginalia.service.MainClass;
-import nu.marginalia.service.discovery.ServiceRegistryIf;
-import nu.marginalia.service.module.ServiceConfiguration;
-import nu.marginalia.service.module.ServiceDiscoveryModule;
 import nu.marginalia.service.ServiceId;
+import nu.marginalia.service.discovery.ServiceRegistryIf;
 import nu.marginalia.service.module.DatabaseModule;
+import nu.marginalia.service.module.ServiceConfiguration;
 import nu.marginalia.service.module.ServiceConfigurationModule;
+import nu.marginalia.service.module.ServiceDiscoveryModule;
 import nu.marginalia.service.server.Initialization;
 import nu.marginalia.service.server.NodeStatusWatcher;
@@ -27,6 +28,7 @@ public class ExecutorMain extends MainClass {
         Injector injector = Guice.createInjector(
                 new ExecutorModule(),
                 new DatabaseModule(false),
+                new NsfwFilterModule(),
                 new ServiceDiscoveryModule(),
                 new ServiceConfigurationModule(ServiceId.Executor)
         );

View File

@@ -37,6 +37,7 @@ dependencies {
     implementation project(':code:functions:search-query:api')
     implementation project(':code:functions:link-graph:api')
     implementation project(':code:functions:link-graph:aggregate')
+    implementation project(':code:functions:nsfw-domain-filter')

     implementation libs.bundles.slf4j

View File

@@ -6,6 +6,7 @@ import com.google.inject.Inject;
 import nu.marginalia.api.searchquery.RpcQueryLimits;
 import nu.marginalia.api.searchquery.RpcResultRankingParameters;
 import nu.marginalia.api.searchquery.RpcTemporalBias;
+import nu.marginalia.api.searchquery.model.query.NsfwFilterTier;
 import nu.marginalia.api.searchquery.model.query.QueryParams;
 import nu.marginalia.api.searchquery.model.results.PrototypeRankingParameters;
 import nu.marginalia.functions.searchquery.QueryGRPCService;
@@ -61,7 +62,7 @@ public class QueryBasicInterface {
                         .setTimeoutMs(250)
                         .setFetchSize(8192)
                         .build()
-                , set);
+                , set, NsfwFilterTier.OFF);

         var pagination = new IndexClient.Pagination(page, count);
@@ -114,7 +115,7 @@ public class QueryBasicInterface {
                         .setTimeoutMs(250)
                         .setFetchSize(8192)
                         .build(),
-                set);
+                set, NsfwFilterTier.OFF);

         var pagination = new IndexClient.Pagination(page, count);

View File

@@ -3,13 +3,14 @@ package nu.marginalia.query;

 import com.google.inject.Guice;
 import com.google.inject.Inject;
 import com.google.inject.Injector;
+import nu.marginalia.nsfw.NsfwFilterModule;
 import nu.marginalia.service.MainClass;
-import nu.marginalia.service.discovery.ServiceRegistryIf;
-import nu.marginalia.service.module.ServiceConfiguration;
-import nu.marginalia.service.module.ServiceDiscoveryModule;
 import nu.marginalia.service.ServiceId;
-import nu.marginalia.service.module.ServiceConfigurationModule;
+import nu.marginalia.service.discovery.ServiceRegistryIf;
 import nu.marginalia.service.module.DatabaseModule;
+import nu.marginalia.service.module.ServiceConfiguration;
+import nu.marginalia.service.module.ServiceConfigurationModule;
+import nu.marginalia.service.module.ServiceDiscoveryModule;
 import nu.marginalia.service.server.Initialization;

 public class QueryMain extends MainClass {
@@ -26,6 +27,7 @@ public class QueryMain extends MainClass {
         Injector injector = Guice.createInjector(
                 new QueryModule(),
                 new DatabaseModule(false),
+                new NsfwFilterModule(),
                 new ServiceDiscoveryModule(),
                 new ServiceConfigurationModule(ServiceId.Query)
         );
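
The NsfwFilterModule itself is not part of this diff; purely as an illustration of why the module is installed here, a Guice module making the filter injectable might look roughly like this (the binding details are assumptions):

import com.google.inject.AbstractModule;
import com.google.inject.Singleton;

public class NsfwFilterModuleSketch extends AbstractModule {
    @Override
    protected void configure() {
        // Assumption: one shared filter instance per process
        bind(NsfwDomainFilter.class).in(Singleton.class);
    }
}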

View File

@@ -0,0 +1,3 @@
FROM ghcr.io/browserless/chromium:latest
COPY extension/ /dom-export

View File

@@ -0,0 +1,45 @@
plugins {
id 'base'
}
def imageName = 'marginalia-browserless'
def imageTag = project.hasProperty('imageTag') ? project.getProperty('imageTag') : 'latest'
tasks.register('docker', Exec) {
group = 'Docker'
description = 'Builds a Docker image using the Dockerfile in project root'
workingDir = projectDir
// Build the Docker command
commandLine 'docker', 'build',
'-t', "${imageName}:${imageTag}",
'-f', 'Dockerfile',
'--pull',
'--build-arg', "BASE_DIR=.",
'.'
// Add optional parameters if specified
if (project.hasProperty('noCache') && project.getProperty('noCache').toBoolean()) {
commandLine += '--no-cache'
}
doFirst {
println "Building Docker image '${imageName}:${imageTag}'..."
}
doLast {
println "Docker image '${imageName}:${imageTag}' has been built successfully."
}
}
// Add task to ensure the extension folder is included in the Docker context
tasks.register('prepareExtension', Copy) {
from 'extension'
into "${buildDir}/docker/extension"
}
// Make the docker task depend on prepareExtension
tasks.named('docker').configure {
dependsOn 'prepareExtension'
}
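
With this build script in place, the image can be built from the command line; a usage example based on the properties defined above (the tag value is illustrative):

./gradlew docker -PimageTag=2025-06-01 -PnoCache=true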

View File

@@ -0,0 +1,32 @@
// Listen to web requests and buffer them until the content script is ready
chrome.webRequest.onBeforeRequest.addListener(
(details) => {
const requestData = {
url: details.url,
method: details.method,
timestamp: Date.now()
};
console.log(requestData);
chrome.tabs.sendMessage(details.tabId, {
type: 'URL_INTERCEPTED',
...requestData
});
},
{ urls: ["<all_urls>"] }
);
// Listen to web navigation events and re-register content scripts when a page is reloaded or navigated to a new subframe
chrome.webNavigation.onCommitted.addListener(function(details) {
if (details.transitionType === 'reload' || details.transitionType === 'auto_subframe') {
chrome.scripting.registerContentScripts([{
id: "content-script",
matches : [ "<all_urls>" ],
js : [ "content.js" ]
}]);
}
});

View File

@@ -0,0 +1,646 @@
// This script runs in the context of web pages loaded by the browser extension
// Listen to messages from the background script
var networkRequests = document.createElement('div')
networkRequests.setAttribute('id', 'marginalia-network-requests');
chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
if (message.type === 'URL_INTERCEPTED') {
var request = document.createElement('div');
request.setAttribute('class', 'network-request');
request.setAttribute('data-url', message.url);
request.setAttribute('data-method', message.method);
request.setAttribute('data-timestamp', message.timestamp);
networkRequests.appendChild(request)
}
});
// Function to add styles as data attributes based on specified properties
function addStylesAsDataAttributes(propertyToAttrMap = {
'display': 'data-display',
'position': 'data-position',
'visibility': 'data-visibility',
}) {
const targetedProperties = new Set(Object.keys(propertyToAttrMap).map(prop => prop.toLowerCase()));
const styleSheets = Array.from(document.styleSheets);
try {
styleSheets.forEach(styleSheet => {
try {
if (styleSheet.href && new URL(styleSheet.href).origin !== window.location.origin) {
console.warn(`Skipping cross-origin stylesheet: ${styleSheet.href}`);
return;
}
const cssRules = styleSheet.cssRules || styleSheet.rules;
if (!cssRules) return;
for (let i = 0; i < cssRules.length; i++) {
const rule = cssRules[i];
if (rule.type === 1) {
try {
let containsTargetedProperty = false;
for (let j = 0; j < rule.style.length; j++) {
const property = rule.style[j].toLowerCase();
if (targetedProperties.has(property)) {
containsTargetedProperty = true;
break;
}
}
if (!containsTargetedProperty) continue;
const elements = document.querySelectorAll(rule.selectorText);
elements.forEach(element => {
for (let j = 0; j < rule.style.length; j++) {
const property = rule.style[j].toLowerCase();
if (targetedProperties.has(property)) {
const value = rule.style.getPropertyValue(property);
const dataAttrName = propertyToAttrMap[property];
element.setAttribute(dataAttrName, value);
}
}
});
} catch (selectorError) {
console.error(`Error processing selector "${rule.selectorText}": ${selectorError.message}`);
}
}
}
} catch (sheetError) {
console.error(`Error processing stylesheet: ${sheetError.message}`);
}
});
} catch (error) {
console.error(`Error adding data attributes: ${error.message}`);
}
}
class CookieConsentHandler {
constructor() {
// Keywords that strongly indicate cookie consent
this.cookieKeywords = [
'cookie', 'cookies', 'consent', 'gdpr', 'privacy policy', 'privacy notice',
'data protection', 'tracking', 'analytics', 'personalization', 'advertising',
'essential cookies', 'functional cookies', 'performance cookies'
];
// Keywords that indicate newsletter/subscription popups
this.newsletterKeywords = [
'newsletter', 'subscribe', 'email', 'signup', 'sign up', 'updates',
'notifications', 'discount', 'offer', 'deal', 'promo', 'exclusive'
];
// Common button text for accepting cookies
this.acceptButtonTexts = [
'accept', 'accept all', 'allow all', 'agree', 'ok', 'got it',
'i agree', 'continue', 'yes', 'enable', 'allow cookies',
'accept cookies', 'accept all cookies', 'i understand'
];
// Common button text for rejecting (to avoid clicking these)
this.rejectButtonTexts = [
'reject', 'decline', 'deny', 'refuse', 'no thanks', 'no',
'reject all', 'decline all', 'manage preferences', 'customize',
'settings', 'options', 'learn more'
];
// Special patterns that strongly indicate cookie consent
this.acceptButtonStyles = [
/primary/,
];
}
analyzePopover(element) {
if (!element || !element.textContent) {
return { category: 'unknown', action: 'none', reason: 'Invalid element' };
}
const textContent = element.textContent.toLowerCase();
const category = this.categorizePopover(textContent, element);
let result = {
category: category,
action: 'none',
reason: '',
element: element
};
if (category === 'cookie_consent') {
const acceptResult = this.tryAcceptCookies(element);
result.action = acceptResult.action;
result.reason = acceptResult.reason;
result.buttonClicked = acceptResult.buttonClicked;
}
return result;
}
categorizePopover(textContent, element) {
let cookieScore = 0;
let newsletterScore = 0;
// Score based on keyword presence
this.cookieKeywords.forEach(keyword => {
if (textContent.includes(keyword)) {
cookieScore += keyword === 'cookie' || keyword === 'cookies' ? 3 : 1;
}
});
this.newsletterKeywords.forEach(keyword => {
if (textContent.includes(keyword)) {
newsletterScore += keyword === 'newsletter' || keyword === 'subscribe' ? 3 : 1;
}
});
// Additional heuristics
if (this.hasPrivacyPolicyLink(element)) cookieScore += 2;
if (this.hasManagePreferencesButton(element)) cookieScore += 2;
if (this.hasEmailInput(element)) newsletterScore += 3;
if (this.hasDiscountMention(textContent)) newsletterScore += 2;
// Special patterns that strongly indicate cookie consent
const strongCookiePatterns = [
/we use cookies/,
/this website uses cookies/,
/by continuing to use/,
/essential.*cookies/,
/improve.*experience/,
/gdpr/,
/data protection/
];
if (strongCookiePatterns.some(pattern => pattern.test(textContent))) {
cookieScore += 5;
}
// Determine category
if (cookieScore > newsletterScore && cookieScore >= 2) {
return 'cookie_consent';
} else if (newsletterScore > cookieScore && newsletterScore >= 2) {
return 'newsletter';
} else {
return 'other';
}
}
tryAcceptCookies(element) {
const buttons = this.findButtons(element);
if (buttons.length === 0) {
return { action: 'no_buttons_found', reason: 'No clickable buttons found' };
}
// First, try to find explicit accept buttons
const acceptButton = this.findAcceptButton(buttons);
if (acceptButton) {
try {
acceptButton.click();
return {
action: 'clicked_accept',
reason: 'Found and clicked accept button',
buttonClicked: acceptButton.textContent.trim()
};
} catch (error) {
return {
action: 'click_failed',
reason: `Failed to click button: ${error.message}`,
buttonClicked: acceptButton.textContent.trim()
};
}
}
// If no explicit accept button, try to find the most likely candidate
const likelyButton = this.findMostLikelyAcceptButton(buttons);
if (likelyButton) {
try {
likelyButton.click();
return {
action: 'clicked_likely',
reason: 'Clicked most likely accept button',
buttonClicked: likelyButton.textContent.trim()
};
} catch (error) {
return {
action: 'click_failed',
reason: `Failed to click button: ${error.message}`,
buttonClicked: likelyButton.textContent.trim()
};
}
}
return {
action: 'no_accept_button',
reason: 'Could not identify accept button',
availableButtons: buttons.map(btn => btn.textContent.trim())
};
}
findButtons(element) {
const selectors = [
'button',
'input[type="button"]',
'input[type="submit"]',
'[role="button"]',
'a[href="#"]',
'.button',
'.btn',
'.btn-primary'
];
const buttons = [];
selectors.forEach(selector => {
const found = element.querySelectorAll(selector);
buttons.push(...Array.from(found));
});
// Remove duplicates and filter visible buttons
    return [...new Set(buttons)].filter(btn =>
      btn.offsetWidth > 0 && btn.offsetHeight > 0
    );
  }

  findAcceptButton(buttons) {
    const byClass = buttons.find(button => {
      const classes = button.className.toLowerCase();
      return this.acceptButtonStyles.some(pattern => pattern.test(classes));
    });
    if (byClass != null) {
      return byClass;
    }
    return buttons.find(button => {
      const text = button.textContent.toLowerCase().trim();
      return this.acceptButtonTexts.some(acceptText =>
        text === acceptText || text.includes(acceptText)
      ) && !this.rejectButtonTexts.some(rejectText =>
        text.includes(rejectText)
      );
    });
  }

  findMostLikelyAcceptButton(buttons) {
    if (buttons.length === 1) {
      const text = buttons[0].textContent.toLowerCase();
      // If there's only one button and it's not explicitly a reject button, assume it's accept
      if (!this.rejectButtonTexts.some(rejectText => text.includes(rejectText))) {
        return buttons[0];
      }
    }

    // Look for buttons with positive styling (often green, primary, etc.)
    const positiveButton = buttons.find(button => {
      const classes = button.className.toLowerCase();
      const styles = window.getComputedStyle(button);
      const bgColor = styles.backgroundColor;
      return classes.includes('primary') ||
        classes.includes('accept') ||
        classes.includes('green') ||
        bgColor.includes('rgb(0, 128, 0)') || // green variations
        bgColor.includes('rgb(40, 167, 69)'); // bootstrap success
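      // Only two exact RGB values are matched here; themed accept buttons are
      // more likely to be caught by the class-name checks above.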
    });
    return positiveButton || null;
  }

  hasPrivacyPolicyLink(element) {
    const links = element.querySelectorAll('a');
    return Array.from(links).some(link =>
      link.textContent.toLowerCase().includes('privacy') ||
      link.href.toLowerCase().includes('privacy')
    );
  }

  hasManagePreferencesButton(element) {
    const buttons = this.findButtons(element);
    return buttons.some(button => {
      const text = button.textContent.toLowerCase();
      return text.includes('manage') || text.includes('preferences') ||
        text.includes('settings') || text.includes('customize');
    });
  }

  hasEmailInput(element) {
    const inputs = element.querySelectorAll('input[type="email"], input[placeholder*="email" i]');
    return inputs.length > 0;
  }

  hasDiscountMention(textContent) {
    const discountTerms = ['discount', 'off', '%', 'save', 'deal', 'offer'];
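    // These terms are deliberately broad ('off', '%'), so a match is treated
    // as a hint only: the caller weighs it (+2) against the cookie signals.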
    return discountTerms.some(term => textContent.includes(term));
  }
}

let agreedToPopover = false;

// Usage example:
function handlePopover(popoverElement) {
  const handler = new CookieConsentHandler();
  const result = handler.analyzePopover(popoverElement);
  console.log('Popover analysis result:', result);

  switch (result.category) {
    case 'cookie_consent':
      console.log('Detected cookie consent popover');
      // Both the explicit accept click and the heuristic fallback count as agreement
      if (result.action === 'clicked_accept' || result.action === 'clicked_likely') {
        console.log('Successfully accepted cookies');
        agreedToPopover = true;
      } else {
        console.log('Could not accept cookies:', result.reason);
      }
      break;
    case 'newsletter':
      console.log('Detected newsletter popover - no action taken');
      break;
    default:
      console.log('Unknown popover type - no action taken');
  }
  return result;
}

function finalizeMarginaliaHack() {
  addStylesAsDataAttributes();

  // Find all likely popover elements
  const fixedElements = document.querySelectorAll('[data-position="fixed"]');

  // Attempt to agree to cookie consent popups
  fixedElements.forEach(element => {
    handlePopover(element);
  });

  // If we found a popover and agreed to it, add a notice
  if (agreedToPopover) {
    const notice = document.createElement('div');
    notice.setAttribute('class', 'marginalia-agreed-cookies');
    networkRequests.appendChild(notice);
  }

  const finalize = () => {
    // Add a container for network requests
    document.body.appendChild(networkRequests);
    document.body.setAttribute('id', 'marginaliahack');
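    // (Assumption: the 'marginaliahack' id serves as the ready-marker the
    // capture side waits for before exporting the DOM.)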
  };

  // If we have a popover and agreed to it, wait a bit before finalizing
  // to let the ad networks load so we can capture their requests
  if (agreedToPopover) {
    setTimeout(finalize, 2500);
  } else {
    finalize();
  }
}

class EventSimulator {
  constructor() {}

  // Simulate smooth scrolling down the page
  simulateScrollDown(duration = 2000, distance = null) {
    return new Promise((resolve) => {
      const startTime = Date.now();
      const startScrollY = window.scrollY;
      const maxScroll = document.documentElement.scrollHeight - window.innerHeight;
      const targetDistance = distance || Math.min(window.innerHeight * 3, maxScroll - startScrollY);
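      // Default: scroll up to three viewport-heights, clamped to the scroll
      // range actually remaining below the current position.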
      if (targetDistance <= 0) {
        resolve();
        return;
      }

      const animate = () => {
        const elapsed = Date.now() - startTime;
        const progress = Math.min(elapsed / duration, 1);

        // Ease-out function for smooth scrolling
        const easeOut = 1 - Math.pow(1 - progress, 3);
        const currentDistance = targetDistance * easeOut;
        const newScrollY = startScrollY + currentDistance;

        // Dispatch scroll events as we go
        window.scrollTo(0, newScrollY);

        // Fire custom scroll event
        const scrollEvent = new Event('scroll', {
          bubbles: true,
          cancelable: true
        });

        // Add custom properties to track simulation
        scrollEvent.simulated = true;
        scrollEvent.scrollY = newScrollY;
        scrollEvent.progress = progress;

        window.dispatchEvent(scrollEvent);
        document.dispatchEvent(scrollEvent);
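        // window.scrollTo() also queues a native scroll event; this synthetic
        // dispatch just guarantees listeners fire in step with the animation,
        // tagged with the 'simulated' marker.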
        if (progress < 1) {
          requestAnimationFrame(animate);
        } else {
          resolve();
        }
      };
      requestAnimationFrame(animate);
    });
  }

  // Simulate mouse movement toward URL bar
  simulateMouseToURLBar(duration = 1500) {
    return new Promise((resolve) => {
      const startTime = Date.now();

      // Get current mouse position (or start from center of viewport)
      const startX = window.innerWidth / 2;
      const startY = window.innerHeight / 2;

      // URL bar is typically at the top center of the browser.
      // Since we can't access actual browser chrome, we'll simulate movement
      // toward the top of the viewport where the URL bar would be.
      const targetX = window.innerWidth / 2; // Center horizontally
      const targetY = -50; // Above the viewport (simulating URL bar position)

      const deltaX = targetX - startX;
      const deltaY = targetY - startY;
      let lastMouseEvent = null;

      const animate = () => {
        const elapsed = Date.now() - startTime;
        const progress = Math.min(elapsed / duration, 1);

        // Ease-in-out function for natural mouse movement
        const easeInOut = progress < 0.5
          ? 2 * progress * progress
          : 1 - Math.pow(-2 * progress + 2, 3) / 2;

        const currentX = startX + (deltaX * easeInOut);
        const currentY = startY + (deltaY * easeInOut);

        // Create mouse move event
        const mouseMoveEvent = new MouseEvent('mousemove', {
          bubbles: true,
          cancelable: true,
          clientX: currentX,
          clientY: currentY,
          screenX: currentX,
          screenY: currentY,
          movementX: lastMouseEvent ? currentX - lastMouseEvent.clientX : 0,
          movementY: lastMouseEvent ? currentY - lastMouseEvent.clientY : 0,
          buttons: 0,
          button: -1
        });

        // Add custom properties to track simulation
        mouseMoveEvent.simulated = true;
        mouseMoveEvent.progress = progress;
        mouseMoveEvent.targetType = 'urlbar';

        // Find element under mouse and dispatch event
        const elementUnderMouse = document.elementFromPoint(currentX, currentY);
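        // elementFromPoint() returns null once the coordinates leave the
        // viewport (as they do when approaching targetY), hence the guard.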
        if (elementUnderMouse) {
          elementUnderMouse.dispatchEvent(mouseMoveEvent);

          // Also fire mouseenter/mouseleave events if element changed
          if (lastMouseEvent) {
            const lastElement = document.elementFromPoint(
              lastMouseEvent.clientX,
              lastMouseEvent.clientY
            );
            if (lastElement && lastElement !== elementUnderMouse) {
              // Mouse left previous element
              const mouseLeaveEvent = new MouseEvent('mouseleave', {
                bubbles: false, // mouseleave doesn't bubble
                cancelable: true,
                clientX: currentX,
                clientY: currentY,
                relatedTarget: elementUnderMouse
              });
              mouseLeaveEvent.simulated = true;
              lastElement.dispatchEvent(mouseLeaveEvent);

              // Mouse entered new element
              const mouseEnterEvent = new MouseEvent('mouseenter', {
                bubbles: false, // mouseenter doesn't bubble
                cancelable: true,
                clientX: currentX,
                clientY: currentY,
                relatedTarget: lastElement
              });
              mouseEnterEvent.simulated = true;
              elementUnderMouse.dispatchEvent(mouseEnterEvent);
            }
          }
        }

        // Also dispatch on document and window
        document.dispatchEvent(mouseMoveEvent);
        window.dispatchEvent(mouseMoveEvent);

        lastMouseEvent = mouseMoveEvent;

        if (progress < 1) {
          requestAnimationFrame(animate);
        } else {
          resolve();
        }
      };
      requestAnimationFrame(animate);
    });
  }

  // Simulate realistic mouse movement with slight randomness
  simulateNaturalMouseMovement(targetX, targetY, duration = 1000) {
    return new Promise((resolve) => {
      const startTime = Date.now();
      const startX = window.innerWidth / 2;
      const startY = window.innerHeight / 2;
      const basePathX = targetX - startX;
      const basePathY = targetY - startY;

      const animate = () => {
        const elapsed = Date.now() - startTime;
        const progress = Math.min(elapsed / duration, 1);

        // Add some randomness to make movement more natural
        const randomOffsetX = (Math.random() - 0.5) * 10 * (1 - progress);
        const randomOffsetY = (Math.random() - 0.5) * 10 * (1 - progress);

        // Bezier curve for more natural movement
        const t = progress;
        const bezierProgress = t * t * (3.0 - 2.0 * t);
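        // t*t*(3-2t) is the smoothstep polynomial: velocity is zero at both
        // endpoints, which reads as a natural start and stop.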
        const currentX = startX + (basePathX * bezierProgress) + randomOffsetX;
        const currentY = startY + (basePathY * bezierProgress) + randomOffsetY;

        const mouseMoveEvent = new MouseEvent('mousemove', {
          bubbles: true,
          cancelable: true,
          clientX: currentX,
          clientY: currentY,
          screenX: currentX,
          screenY: currentY
        });
        mouseMoveEvent.simulated = true;
        mouseMoveEvent.natural = true;

        document.dispatchEvent(mouseMoveEvent);

        if (progress < 1) {
          requestAnimationFrame(animate);
        } else {
          resolve();
        }
      };
      requestAnimationFrame(animate);
    });
  }

  // Combined simulation: scroll down while moving mouse toward URL bar
  async simulateBrowsingBehavior() {
    // Start both animations simultaneously
    const scrollPromise = this.simulateScrollDown(300);
    const mousePromise = this.simulateMouseToURLBar(200);

    // Wait for both to complete
    await Promise.all([scrollPromise, mousePromise]);

    // Add a small pause
    await new Promise(resolve => setTimeout(resolve, 100));

    // Simulate some additional natural mouse movement
    await this.simulateNaturalMouseMovement(
      window.innerWidth * 0.3,
      window.innerHeight * 0.1,
      100
    );

    console.log('Browsing behavior simulation completed');
  }
}

// Usage examples:
const simulator = new EventSimulator();

function simulateUserBehavior() {
  simulator.simulateBrowsingBehavior().then(() => {
    console.log('User behavior simulation finished');
  });
}

window.addEventListener("load", (e) => simulateUserBehavior());
window.addEventListener("load", (e) => setTimeout(finalizeMarginaliaHack, 2000));

View File

@@ -0,0 +1,29 @@
{
  "manifest_version": 3,
  "name": "Marginalia DOM Interceptor",
  "version": "1.0",
  "description": "Makes DOM export better",
  "permissions": [
    "activeTab",
    "scripting",
    "webNavigation",
    "webRequest"
  ],
  "host_permissions": [
    "<all_urls>"
  ],
  "background": {
    "service_worker": "background.js",
    "type": "module"
  },
  "content_scripts": [
    {
      "js": ["content.js"],
      "run_at": "document_start",
      "matches": [
        "<all_urls>"
      ]
    }
  ]
}

View File

@@ -8,3 +8,6 @@
2025-05-05: Deploy executor partition 4.
2025-05-05: Deploy control.
2025-05-08: Deploy assistant.
+2025-05-17: Redeploy all.
+2025-05-28: Deploy assistant and browserless.
+2025-06-06: Deploy assistant and browserless.

View File

@@ -1,61 +0,0 @@
# This docker-compose file is for the screenshot-capture-tool service.
#
# It is a standalone daemon that captures screenshots of web pages, based
# on the domain database of Marginalia Search.
#
# It does not start the search engine itself.
#
x-svc: &service
  env_file:
    - "run/env/service.env"
  volumes:
    - conf:/wmsa/conf:ro
    - data:/wmsa/data
    - logs:/var/log/wmsa
  networks:
    - wmsa

services:
  screenshot-capture-tool:
    <<: *service
    image: "marginalia/screenshot-capture-tool"
    container_name: "screenshot-capture-tool"
    networks:
      - wmsa
      - headlesschrome
    depends_on:
      - browserless
  browserless:
    <<: *service
    image: "browserless/chrome"
    container_name: "headlesschrome"
    env_file:
      - "run/env/browserless.env"
    ports:
      - "3000:3000"
    networks:
      - wmsa
      - headlesschrome

networks:
  wmsa:
  headlesschrome:

volumes:
  logs:
    driver: local
    driver_opts:
      type: none
      o: bind
      device: run/logs
  conf:
    driver: local
    driver_opts:
      type: none
      o: bind
      device: run/conf
  data:
    driver: local
    driver_opts:
      type: none
      o: bind
      device: run/data

View File

@@ -20,6 +20,7 @@ include 'code:functions:favicon'
include 'code:functions:favicon:api'
include 'code:functions:domain-info'
include 'code:functions:domain-info:api'
+include 'code:functions:nsfw-domain-filter'

include 'code:functions:link-graph:partition'
include 'code:functions:link-graph:aggregate'
@@ -93,6 +94,7 @@ include 'code:tools:experiment-runner'
include 'code:tools:screenshot-capture-tool'
include 'code:tools:load-test'
include 'code:tools:integration-test'
+include 'code:tools:browserless'

include 'third-party:porterstemmer'
include 'third-party:symspell'
@@ -152,9 +154,9 @@ dependencyResolutionManagement {
library('guice', 'com.google.inject', 'guice').version('7.0.0')
library('guava', 'com.google.guava', 'guava').version('32.0.1-jre')
library('protobuf', 'com.google.protobuf', 'protobuf-java').version('3.16.3')
-library('grpc-protobuf', 'io.grpc', 'grpc-protobuf').version('1.49.2')
-library('grpc-stub', 'io.grpc', 'grpc-stub').version('1.49.2')
-library('grpc-netty', 'io.grpc', 'grpc-netty-shaded').version('1.49.2')
+library('grpc-protobuf', 'io.grpc', 'grpc-protobuf').version('1.73.0')
+library('grpc-stub', 'io.grpc', 'grpc-stub').version('1.73.0')
+library('grpc-netty', 'io.grpc', 'grpc-netty-shaded').version('1.73.0')
library('prometheus', 'io.prometheus', 'simpleclient').version('0.16.0')
library('prometheus-servlet', 'io.prometheus', 'simpleclient_servlet').version('0.16.0')

View File

@@ -5,9 +5,6 @@ import subprocess, os
from typing import List, Set, Dict, Optional
import argparse

-build_dir = "/app/search.marginalia.nu/build"
-docker_dir = "/app/search.marginalia.nu/docker"

@dataclass
class ServiceConfig:
    """Configuration for a service"""
@@ -17,6 +14,99 @@ class ServiceConfig:
    deploy_tier: int
    groups: Set[str]

+# Define the service configurations
+build_dir = "/app/search.marginalia.nu/build"
+docker_dir = "/app/search.marginalia.nu/docker"
+
+SERVICE_CONFIG = {
+    'search': ServiceConfig(
+        gradle_target=':code:services-application:search-service:docker',
+        docker_name='search-service',
+        instances=2,
+        deploy_tier=2,
+        groups={"all", "frontend", "core"}
+    ),
+    'search-legacy': ServiceConfig(
+        gradle_target=':code:services-application:search-service-legacy:docker',
+        docker_name='search-service-legacy',
+        instances=None,
+        deploy_tier=3,
+        groups={"all", "frontend", "core"}
+    ),
+    'api': ServiceConfig(
+        gradle_target=':code:services-application:api-service:docker',
+        docker_name='api-service',
+        instances=2,
+        deploy_tier=1,
+        groups={"all", "core"}
+    ),
+    'browserless': ServiceConfig(
+        gradle_target=':code:tools:browserless:docker',
+        docker_name='browserless',
+        instances=None,
+        deploy_tier=2,
+        groups={"all", "core"}
+    ),
+    'assistant': ServiceConfig(
+        gradle_target=':code:services-core:assistant-service:docker',
+        docker_name='assistant-service',
+        instances=2,
+        deploy_tier=2,
+        groups={"all", "core"}
+    ),
+    'explorer': ServiceConfig(
+        gradle_target=':code:services-application:explorer-service:docker',
+        docker_name='explorer-service',
+        instances=None,
+        deploy_tier=1,
+        groups={"all", "extra"}
+    ),
+    'dating': ServiceConfig(
+        gradle_target=':code:services-application:dating-service:docker',
+        docker_name='dating-service',
+        instances=None,
+        deploy_tier=1,
+        groups={"all", "extra"}
+    ),
+    'index': ServiceConfig(
+        gradle_target=':code:services-core:index-service:docker',
+        docker_name='index-service',
+        instances=10,
+        deploy_tier=3,
+        groups={"all", "index"}
+    ),
+    'executor': ServiceConfig(
+        gradle_target=':code:services-core:executor-service:docker',
+        docker_name='executor-service',
+        instances=10,
+        deploy_tier=3,
+        groups={"all", "executor"}
+    ),
+    'control': ServiceConfig(
+        gradle_target=':code:services-core:control-service:docker',
+        docker_name='control-service',
+        instances=None,
+        deploy_tier=0,
+        groups={"all", "core"}
+    ),
+    'status': ServiceConfig(
+        gradle_target=':code:services-application:status-service:docker',
+        docker_name='status-service',
+        instances=None,
+        deploy_tier=4,
+        groups={"all"}
+    ),
+    'query': ServiceConfig(
+        gradle_target=':code:services-core:query-service:docker',
+        docker_name='query-service',
+        instances=2,
+        deploy_tier=2,
+        groups={"all", "query"}
+    ),
+}
@dataclass
class DeploymentPlan:
    services_to_build: List[str]
@@ -76,7 +166,7 @@ def parse_deployment_tags(
    instances_to_hold = set()
    available_services = set(service_config.keys())
-    available_groups = set()
+    available_groups = set.union(*[service.groups for service in service_config.values()])
    partitions = set()
@@ -89,7 +179,6 @@ def parse_deployment_tags(
                partitions.add(int(p))

        if tag.startswith('deploy:'):
            parts = tag[7:].strip().split(',')
            for part in parts:
                part = part.strip()
@@ -250,85 +339,7 @@ def add_tags(tags: str) -> None:
# Example usage:
if __name__ == '__main__':
    # Define service configuration
-    SERVICE_CONFIG = {
-        'search': ServiceConfig(
-            gradle_target=':code:services-application:search-service:docker',
-            docker_name='search-service',
-            instances=2,
-            deploy_tier=2,
-            groups={"all", "frontend", "core"}
-        ),
-        'search-legacy': ServiceConfig(
-            gradle_target=':code:services-application:search-service-legacy:docker',
-            docker_name='search-service-legacy',
-            instances=None,
-            deploy_tier=3,
-            groups={"all", "frontend", "core"}
-        ),
-        'api': ServiceConfig(
-            gradle_target=':code:services-application:api-service:docker',
-            docker_name='api-service',
-            instances=2,
-            deploy_tier=1,
-            groups={"all", "core"}
-        ),
-        'assistant': ServiceConfig(
-            gradle_target=':code:services-core:assistant-service:docker',
-            docker_name='assistant-service',
-            instances=2,
-            deploy_tier=2,
-            groups={"all", "core"}
-        ),
-        'explorer': ServiceConfig(
-            gradle_target=':code:services-application:explorer-service:docker',
-            docker_name='explorer-service',
-            instances=None,
-            deploy_tier=1,
-            groups={"all", "extra"}
-        ),
-        'dating': ServiceConfig(
-            gradle_target=':code:services-application:dating-service:docker',
-            docker_name='dating-service',
-            instances=None,
-            deploy_tier=1,
-            groups={"all", "extra"}
-        ),
-        'index': ServiceConfig(
-            gradle_target=':code:services-core:index-service:docker',
-            docker_name='index-service',
-            instances=10,
-            deploy_tier=3,
-            groups={"all", "index"}
-        ),
-        'executor': ServiceConfig(
-            gradle_target=':code:services-core:executor-service:docker',
-            docker_name='executor-service',
-            instances=10,
-            deploy_tier=3,
-            groups={"all", "executor"}
-        ),
-        'control': ServiceConfig(
-            gradle_target=':code:services-core:control-service:docker',
-            docker_name='control-service',
-            instances=None,
-            deploy_tier=0,
-            groups={"all", "core"}
-        ),
-        'status': ServiceConfig(
-            gradle_target=':code:services-application:status-service:docker',
-            docker_name='status-service',
-            instances=None,
-            deploy_tier=4,
-            groups={"all"}
-        ),
-        'query': ServiceConfig(
-            gradle_target=':code:services-core:query-service:docker',
-            docker_name='query-service',
-            instances=2,
-            deploy_tier=2,
-            groups={"all", "query"}
-        ),
-    }

    try:
        parser = argparse.ArgumentParser(
@@ -337,7 +348,7 @@ if __name__ == '__main__':
        parser.add_argument('-v', '--verify', help='Verify the tags are valid, if present', action='store_true')
        parser.add_argument('-a', '--add', help='Add the tags provided as a new deployment tag, usually combined with -t', action='store_true')
-        parser.add_argument('-t', '--tag', help='Use the specified tag value instead of the head git tag starting with deploy-')
+        parser.add_argument('-t', '--tag', help='Use the specified tag value instead of the head git tag starting with deploy-; Expecting tags on the format "+service", "-service", or "group"')

        args = parser.parse_args()
        tags = args.tag
@@ -365,7 +376,7 @@ if __name__ == '__main__':
            build_and_deploy(plan, SERVICE_CONFIG)
        else:
-            print("No tags found")
+            print("No tags found.")

    except ValueError as e:
        print(f"Error: {e}")