Mirror of https://github.com/DioCrafts/OxiCloud.git (synced 2025-10-06 00:22:38 +02:00)

Initial commit
.gitignore (vendored, new file, 63 lines)
@@ -0,0 +1,63 @@
# Generated by Cargo
/target/
**/target/

# Remove Cargo.lock from gitignore if creating an executable, leave it for libraries
# More information here https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html
# Cargo.lock

# These are backup files generated by rustfmt
**/*.rs.bk

# MSVC Windows builds of rustc generate these, which store debugging information
*.pdb

# Rust debug symbols
*.dSYM/
*.su
*.idb

# Build cache
.cargo/

# Visual Studio Code directory
.vscode/

# JetBrains IDEs
.idea/

# MacOS specific
.DS_Store
.AppleDouble
.LSOverride

# Linux specific
*~
.directory
.Trash-*

# Windows specific
Thumbs.db
ehthumbs.db
Desktop.ini

# Node.js (if used for frontend)
node_modules/
npm-debug.log

# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local

# Log files
*.log
logs/

# Temporary files
*.tmp
*.bak
*.swp
*.swo
CLAUDE.md (new file, 32 lines)
@@ -0,0 +1,32 @@
# OxiCloud Development Guide

## Build Commands
```bash
cargo build               # Build the project
cargo run                 # Run the project locally (server at http://127.0.0.1:8085)
cargo test                # Run all tests
cargo test -- --nocapture # Run tests with output displayed
cargo test <test_name>    # Run a specific test (e.g., cargo test file_service)
cargo clippy              # Run linter to catch common mistakes
cargo clippy --fix        # Fix auto-fixable linting issues
cargo fmt --check         # Check code formatting without changing files
cargo fmt                 # Format code according to Rust conventions
RUST_LOG=debug cargo run  # Run with detailed logging for debugging
```

## Code Style Guidelines
- **Architecture**: Follow Clean Architecture layers (domain, application, infrastructure, interfaces)
- **Naming**: Use `snake_case` for files, modules, functions, variables; `PascalCase` for types/structs/enums
- **Modules**: Use mod.rs files for explicit exports with visibility modifiers (pub, pub(crate))
- **Error Handling**: Use Result<T, E> with thiserror for custom error types; propagate errors with the ? operator
- **Comments**: Document public APIs with /// doc comments; explain "why", not "what"
- **Imports**: Group imports: 1) std, 2) external crates, 3) internal modules (with blank lines between)
- **Async**: Use async-trait for repository interfaces; handle futures with .await and the tokio runtime
- **Testing**: Write unit tests in the same file as the implementation (bottom of file, in a tests module)
- **Dependencies**: Use axum for the web API, tower-http for middleware, serde for serialization
- **Logging**: Use the tracing crate with appropriate levels (debug, info, warn, error)
- **Repository Pattern**: Define interfaces in the domain layer, implement them in the infrastructure layer
- **I18n**: Store translations in JSON files under static/locales/; use the i18n service for text lookups
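The error-handling guideline above (custom error types plus `?` propagation) can be sketched with std only. In the codebase the `Display` and `Error` impls below are derived by `thiserror` instead; the type and function names here are illustrative, not taken from the repo:

```rust
use std::fmt;

// Hand-rolled version of what `#[derive(thiserror::Error)]` generates.
#[derive(Debug)]
enum FileError {
    NotFound(String),
    InvalidPath(String),
}

impl fmt::Display for FileError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            FileError::NotFound(id) => write!(f, "File not found: {id}"),
            FileError::InvalidPath(p) => write!(f, "Invalid file path: {p}"),
        }
    }
}

impl std::error::Error for FileError {}

// A fallible lookup returning the custom error type.
fn lookup(id: &str) -> Result<String, FileError> {
    if id.is_empty() {
        return Err(FileError::NotFound(id.to_string()));
    }
    Ok(format!("file-{id}"))
}

// Errors bubble up to the caller with the `?` operator.
fn describe(id: &str) -> Result<String, FileError> {
    let name = lookup(id)?;
    Ok(format!("found {name}"))
}
```

With `thiserror` the `Display` impl collapses into `#[error("File not found: {0}")]` attributes on the variants, which is the style the guideline asks for.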

## Project Structure
OxiCloud is a NextCloud-like file storage system built in Rust with a focus on performance and security. It provides a clean REST API and web interface for file management using a layered architecture approach. The roadmap in TODO-LIST.md outlines planned features including enhanced folder support, file previews, user authentication, sharing, and a sync client.
Cargo.lock (generated, new file, 1139 lines)
File diff suppressed because it is too large
Cargo.toml (new file, 20 lines)
@@ -0,0 +1,20 @@
[package]
name = "oxicloud"
version = "0.1.0"
edition = "2021"

[dependencies]
axum = { version = "0.8.1", features = ["multipart"] }
tokio = { version = "1.44.1", features = ["full"] }
tokio-util = { version = "0.7.14", features = ["io"] }
tower = "0.5.2"
tower-http = { version = "0.6.2", features = ["fs", "compression-gzip", "trace", "cors"] }
tracing = "0.1.41"
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
serde = { version = "1.0.219", features = ["derive"] }
serde_json = "1.0.140"
futures = "0.3.31"
mime_guess = "2.0.5"
uuid = { version = "1.16.0", features = ["v4", "serde"] }
async-trait = "0.1.88"
thiserror = "2.0.12"
TODO-LIST.md (new file, 172 lines)
@@ -0,0 +1,172 @@
# OxiCloud TODO List

This document contains the task list for the development of OxiCloud, a minimalist and efficient cloud storage system similar to NextCloud but optimized for performance.

## Phase 1: Basic file features

### Folder system
- [ ] Implement API for creating folders
- [ ] Add hierarchical path support in the backend
- [ ] Update UI to display the folder structure (tree)
- [ ] Implement navigation between folders
- [ ] Add functionality to rename folders
- [ ] Add option to move files between folders

### File previews
- [ ] Implement built-in image viewer
- [ ] Add basic PDF viewer
- [ ] Generate thumbnails for images
- [ ] Implement file-type-specific icons
- [ ] Add text/code preview

### Improved search
- [ ] Implement search by name
- [ ] Add filters by file type
- [ ] Implement search by date range
- [ ] Add filter by file size
- [ ] Add search within specific folders
- [ ] Implement caching for search results

### UI/UX optimizations
- [ ] Improve responsive design for mobile
- [ ] Implement drag & drop between folders
- [ ] Add support for multi-file selection
- [ ] Implement multi-file upload
- [ ] Add progress indicators for long operations
- [ ] Implement in-UI notifications for events

## Phase 2: Authentication and multi-user

### User system
- [ ] Design data model for users
- [ ] Implement user registration
- [ ] Create login system
- [ ] Add user profile page
- [ ] Implement password recovery
- [ ] Separate storage per user

### Quotas and permissions
- [ ] Implement storage quota system
- [ ] Add basic role system (admin/user)
- [ ] Create administration panel
- [ ] Implement folder-level permissions
- [ ] Add storage usage monitoring

### Basic security
- [ ] Implement secure password hashing with Argon2
- [ ] Add session management
- [ ] Implement JWT authentication tokens
- [ ] Add CSRF protection
- [ ] Implement login attempt limits
- [ ] Create activity logging system

## Phase 3: Collaboration features

### File sharing
- [ ] Implement shared link generation
- [ ] Add permission configuration for links
- [ ] Implement password protection for links
- [ ] Add expiration dates for shared links
- [ ] Create page to manage all shared resources
- [ ] Implement notifications on sharing

### Recycle bin
- [ ] Design model for storing deleted files
- [ ] Implement soft deletion (move to trash)
- [ ] Add functionality to restore files
- [ ] Implement time-based automatic purging
- [ ] Add option to empty the trash manually
- [ ] Implement storage limits for the trash

### Activity log
- [ ] Create model for activity events
- [ ] Implement logging of CRUD operations
- [ ] Add logging of access and security events
- [ ] Create activity history page
- [ ] Implement filters for the activity log
- [ ] Add log export

## Phase 4: API and synchronization

### Full REST API
- [ ] Design OpenAPI specification
- [ ] Implement endpoints for file operations
- [ ] Add endpoints for users and authentication
- [ ] Implement automatic documentation (Swagger)
- [ ] Create API token system
- [ ] Implement rate limiting
- [ ] Add API versioning

### WebDAV support
- [ ] Implement basic WebDAV server
- [ ] Add authentication for WebDAV
- [ ] Implement PROPFIND operations
- [ ] Add locking support
- [ ] Test compatibility with standard clients
- [ ] Optimize WebDAV performance

### Sync client
- [ ] Design client architecture in Rust
- [ ] Implement one-way synchronization
- [ ] Add two-way synchronization
- [ ] Implement conflict detection
- [ ] Add configuration options
- [ ] Create minimal client version for Windows/macOS/Linux

## Phase 5: Advanced features

### File encryption
- [ ] Research and select encryption algorithms
- [ ] Implement at-rest encryption for files
- [ ] Add key management
- [ ] Implement encryption for shared files
- [ ] Create security documentation

### File versioning
- [ ] Design version storage system
- [ ] Implement version history
- [ ] Add diff visualization
- [ ] Implement version restoration
- [ ] Add version retention policies

### Basic applications
- [ ] Design plugin/app system
- [ ] Implement basic text viewer/editor
- [ ] Add simple notes application
- [ ] Implement basic calendar
- [ ] Create API for third-party applications

## Ongoing optimizations

### Backend
- [ ] Implement file caching in Rust
- [ ] Optimize large file transfers
- [ ] Add adaptive compression by file type
- [ ] Implement asynchronous processing for heavy tasks
- [ ] Optimize database queries
- [ ] Implement scaling strategies

### Frontend
- [ ] Optimize initial asset loading
- [ ] Implement lazy loading for large lists
- [ ] Add local caching (localStorage/IndexedDB)
- [ ] Optimize UI rendering
- [ ] Implement smart prefetching
- [ ] Add basic offline support

### Storage
- [ ] Research deduplication options
- [ ] Implement block-based storage
- [ ] Add transparent compression by file type
- [ ] Implement log rotation and archiving
- [ ] Create automated backup system
- [ ] Add support for distributed storage

## Infrastructure and deployment

- [ ] Create Docker configuration
- [ ] Implement CI/CD with GitHub Actions
- [ ] Add automated tests
- [ ] Create installation documentation
- [ ] Implement monitoring and alerts
- [ ] Add automatic update system
identifier.sh (new file, 8 lines)
@@ -0,0 +1,8 @@
while IFS= read -r -d '' file; do
  if grep -Iq . "$file"; then
    echo "===== $file ====="
    cat "$file"
    echo -e "\n"
  fi
done < <(find . -type f -print0)
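identifier.sh dumps every non-empty text file in the tree under a `===== path =====` header, skipping binaries via `grep -I`. The same find/grep loop can be exercised against a throwaway directory (the temp path and file names below are illustrative, and the snippet assumes bash for the process substitution):

```shell
tmp=$(mktemp -d)
printf 'hello\n' > "$tmp/a.txt"
printf 'bin\0ary' > "$tmp/b.bin"   # contains a NUL byte, so grep -I treats it as binary

out=$(while IFS= read -r -d '' file; do
  if grep -Iq . "$file"; then      # -I: skip binary files, -q: just test for any content
    echo "===== $file ====="
    cat "$file"
  fi
done < <(find "$tmp" -type f -print0))

echo "$out"                        # only a.txt appears; b.bin is skipped
rm -rf "$tmp"
```

The `-print0`/`read -d ''` pairing keeps the loop safe for file names containing spaces or newlines.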
src/application/dtos/file_dto.rs (new file, 45 lines)
@@ -0,0 +1,45 @@
use serde::Serialize;
use crate::domain::entities::file::File;

/// DTO for file responses
#[derive(Debug, Serialize)]
pub struct FileDto {
    /// File ID
    pub id: String,

    /// File name
    pub name: String,

    /// Path to the file (relative)
    pub path: String,

    /// Size in bytes
    pub size: u64,

    /// MIME type
    pub mime_type: String,

    /// Parent folder ID
    pub folder_id: Option<String>,

    /// Creation timestamp
    pub created_at: u64,

    /// Last modification timestamp
    pub modified_at: u64,
}

impl From<File> for FileDto {
    fn from(file: File) -> Self {
        Self {
            id: file.id,
            name: file.name,
            path: file.path.to_string_lossy().to_string(),
            size: file.size,
            mime_type: file.mime_type,
            folder_id: file.folder_id,
            created_at: file.created_at,
            modified_at: file.modified_at,
        }
    }
}
src/application/dtos/folder_dto.rs (new file, 66 lines)
@@ -0,0 +1,66 @@
use serde::{Serialize, Deserialize};
use crate::domain::entities::folder::Folder;

/// DTO for folder creation requests
#[derive(Debug, Deserialize)]
pub struct CreateFolderDto {
    /// Name of the folder to create
    pub name: String,

    /// Parent folder ID (None for root level)
    pub parent_id: Option<String>,
}

/// DTO for folder rename requests
#[derive(Debug, Deserialize)]
pub struct RenameFolderDto {
    /// New name for the folder
    pub name: String,
}

/// DTO for folder move requests
#[derive(Debug, Deserialize)]
pub struct MoveFolderDto {
    /// New parent folder ID (None for root level)
    pub parent_id: Option<String>,
}

/// DTO for folder responses
#[derive(Debug, Serialize)]
pub struct FolderDto {
    /// Folder ID
    pub id: String,

    /// Folder name
    pub name: String,

    /// Path to the folder (relative)
    pub path: String,

    /// Parent folder ID
    pub parent_id: Option<String>,

    /// Creation timestamp
    pub created_at: u64,

    /// Last modification timestamp
    pub modified_at: u64,

    /// Whether this is a root folder
    pub is_root: bool,
}

impl From<Folder> for FolderDto {
    fn from(folder: Folder) -> Self {
        let is_root = folder.parent_id.is_none();
        Self {
            id: folder.id,
            name: folder.name,
            path: folder.path.to_string_lossy().to_string(),
            parent_id: folder.parent_id,
            created_at: folder.created_at,
            modified_at: folder.modified_at,
            is_root,
        }
    }
}
src/application/dtos/i18n_dto.rs (new file, 62 lines)
@@ -0,0 +1,62 @@
use serde::{Serialize, Deserialize};
use crate::domain::services::i18n_service::Locale;

/// DTO for locale information
#[derive(Debug, Serialize, Deserialize)]
pub struct LocaleDto {
    /// Locale code (e.g., "en", "es")
    pub code: String,

    /// Locale name in its own language (e.g., "English", "Español")
    pub name: String,
}

impl From<Locale> for LocaleDto {
    fn from(locale: Locale) -> Self {
        let (code, name) = match locale {
            Locale::English => ("en", "English"),
            Locale::Spanish => ("es", "Español"),
        };

        Self {
            code: code.to_string(),
            name: name.to_string(),
        }
    }
}

/// DTO for translation requests
#[derive(Debug, Deserialize)]
pub struct TranslationRequestDto {
    /// The translation key
    pub key: String,

    /// The locale code (optional, defaults to "en")
    pub locale: Option<String>,
}

/// DTO for translation responses
#[derive(Debug, Serialize)]
pub struct TranslationResponseDto {
    /// The translation key
    pub key: String,

    /// The locale code used for translation
    pub locale: String,

    /// The translated text
    pub text: String,
}

/// DTO for translation errors
#[derive(Debug, Serialize)]
pub struct TranslationErrorDto {
    /// The translation key that was not found
    pub key: String,

    /// The locale code used for translation
    pub locale: String,

    /// The error message
    pub error: String,
}
src/application/dtos/mod.rs (new file, 4 lines)
@@ -0,0 +1,4 @@
pub mod file_dto;
pub mod folder_dto;
pub mod i18n_dto;
src/application/mod.rs (new file, 3 lines)
@@ -0,0 +1,3 @@
pub mod dtos;
pub mod services;
src/application/services/file_service.rs (new file, 125 lines)
@@ -0,0 +1,125 @@
use std::sync::Arc;

use crate::domain::repositories::file_repository::{FileRepository, FileRepositoryResult};
use crate::application::dtos::file_dto::FileDto;

/// Service for file operations
pub struct FileService {
    file_repository: Arc<dyn FileRepository>,
}

impl FileService {
    /// Creates a new file service
    pub fn new(file_repository: Arc<dyn FileRepository>) -> Self {
        Self { file_repository }
    }

    /// Uploads a new file from bytes
    pub async fn upload_file_from_bytes(
        &self,
        name: String,
        folder_id: Option<String>,
        content_type: String,
        content: Vec<u8>,
    ) -> FileRepositoryResult<FileDto> {
        let file = self.file_repository.save_file_from_bytes(name, folder_id, content_type, content).await?;
        Ok(FileDto::from(file))
    }

    /// Gets a file by ID
    pub async fn get_file(&self, id: &str) -> FileRepositoryResult<FileDto> {
        let file = self.file_repository.get_file_by_id(id).await?;
        Ok(FileDto::from(file))
    }

    /// Lists files in a folder
    pub async fn list_files(&self, folder_id: Option<&str>) -> FileRepositoryResult<Vec<FileDto>> {
        let files = self.file_repository.list_files(folder_id).await?;
        Ok(files.into_iter().map(FileDto::from).collect())
    }

    /// Deletes a file
    pub async fn delete_file(&self, id: &str) -> FileRepositoryResult<()> {
        self.file_repository.delete_file(id).await
    }

    /// Gets file content
    pub async fn get_file_content(&self, id: &str) -> FileRepositoryResult<Vec<u8>> {
        self.file_repository.get_file_content(id).await
    }

    /// Moves a file to a new folder by saving it at the new location first, and only then deleting the original
    pub async fn move_file(&self, file_id: &str, folder_id: Option<String>) -> FileRepositoryResult<FileDto> {
        // Get the complete info of the current file
        let source_file = match self.file_repository.get_file_by_id(file_id).await {
            Ok(f) => f,
            Err(e) => {
                tracing::error!("Failed to get file (ID: {}): {}", file_id, e);
                return Err(e);
            }
        };

        tracing::info!("Moving file: {} (ID: {}) from folder: {:?} to folder: {:?}",
            source_file.name, file_id, source_file.folder_id, folder_id);

        // Special handling for PDF files
        let is_pdf = source_file.name.to_lowercase().ends_with(".pdf");
        if is_pdf {
            tracing::info!("Moving a PDF file: {}", source_file.name);
        }

        // Do nothing if the file is already in the destination folder
        if source_file.folder_id == folder_id {
            tracing::info!("File is already in the destination folder; no move needed");
            return Ok(FileDto::from(source_file));
        }

        // Step 1: Get the file content
        tracing::info!("Reading file content: {}", source_file.name);
        let content = match self.file_repository.get_file_content(file_id).await {
            Ok(content) => {
                tracing::info!("File content read successfully: {} bytes", content.len());
                content
            },
            Err(e) => {
                tracing::error!("Failed to read content of file {}: {}", file_id, e);
                return Err(e);
            }
        };

        // Step 2: Save the file to the new location with a new ID
        tracing::info!("Saving file to new location: {} in folder: {:?}", source_file.name, folder_id);
        let new_file = match self.file_repository.save_file_from_bytes(
            source_file.name.clone(),
            folder_id.clone(),
            source_file.mime_type.clone(),
            content
        ).await {
            Ok(file) => {
                tracing::info!("File saved to new location with ID: {}", file.id);
                file
            },
            Err(e) => {
                tracing::error!("Failed to save file to new location: {}", e);
                return Err(e);
            }
        };

        // Step 3: Only after the new file is safely saved, try to delete the old one.
        // If this fails it is not critical - the file already exists at the new location
        tracing::info!("Deleting original file with ID: {}", file_id);
        match self.file_repository.delete_file(file_id).await {
            Ok(_) => tracing::info!("Original file deleted successfully"),
            Err(e) => {
                tracing::warn!("Failed to delete original file (ID: {}): {} - a duplicate file may remain", file_id, e);
                // Continue even if the delete fails - at worst we end up with duplicate files
            }
        }

        tracing::info!("File moved successfully: {} (ID: {}) to folder: {:?}",
            new_file.name, new_file.id, folder_id);

        Ok(FileDto::from(new_file))
    }
}
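The copy-then-delete ordering in `move_file` is a crash-safety pattern: the content exists at the destination before the original is touched, so a failure can at worst leave a duplicate, never lose the file. A std-only sketch of the same idea (the function name and temp paths are illustrative; the service goes through its repository rather than `std::fs`):

```rust
use std::fs;
use std::path::Path;

/// Illustrative sketch: write the file at the destination first, and only
/// then remove the original, mirroring steps 1-3 of FileService::move_file.
fn move_file_safely(src: &Path, dst: &Path) -> std::io::Result<()> {
    fs::copy(src, dst)?; // content now exists at the new location
    if let Err(e) = fs::remove_file(src) {
        // deletion failure is non-fatal, like the tracing::warn! branch above
        eprintln!("warning: could not remove original: {e}");
    }
    Ok(())
}
```

The inverse ordering (delete first, then write) would lose data if the write fails, which is why the service only warns, rather than errors, on a failed delete.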
src/application/services/folder_service.rs (new file, 68 lines)
@@ -0,0 +1,68 @@
use std::path::PathBuf;
use std::sync::Arc;

use crate::domain::repositories::folder_repository::{FolderRepository, FolderRepositoryResult};
use crate::application::dtos::folder_dto::{CreateFolderDto, RenameFolderDto, MoveFolderDto, FolderDto};

/// Service for folder operations
pub struct FolderService {
    folder_repository: Arc<dyn FolderRepository>,
}

impl FolderService {
    /// Creates a new folder service
    pub fn new(folder_repository: Arc<dyn FolderRepository>) -> Self {
        Self { folder_repository }
    }

    /// Creates a new folder
    pub async fn create_folder(&self, dto: CreateFolderDto) -> FolderRepositoryResult<FolderDto> {
        let parent_path = match &dto.parent_id {
            Some(parent_id) => {
                let parent = self.folder_repository.get_folder_by_id(parent_id).await?;
                Some(parent.path)
            },
            None => None
        };

        let folder = self.folder_repository.create_folder(dto.name, parent_path).await?;
        Ok(FolderDto::from(folder))
    }

    /// Gets a folder by ID
    pub async fn get_folder(&self, id: &str) -> FolderRepositoryResult<FolderDto> {
        let folder = self.folder_repository.get_folder_by_id(id).await?;
        Ok(FolderDto::from(folder))
    }

    /// Gets a folder by path
    #[allow(dead_code)]
    pub async fn get_folder_by_path(&self, path: &str) -> FolderRepositoryResult<FolderDto> {
        let path_buf = PathBuf::from(path);
        let folder = self.folder_repository.get_folder_by_path(&path_buf).await?;
        Ok(FolderDto::from(folder))
    }

    /// Lists folders in a parent folder
    pub async fn list_folders(&self, parent_id: Option<&str>) -> FolderRepositoryResult<Vec<FolderDto>> {
        let folders = self.folder_repository.list_folders(parent_id).await?;
        Ok(folders.into_iter().map(FolderDto::from).collect())
    }

    /// Renames a folder
    pub async fn rename_folder(&self, id: &str, dto: RenameFolderDto) -> FolderRepositoryResult<FolderDto> {
        let folder = self.folder_repository.rename_folder(id, dto.name).await?;
        Ok(FolderDto::from(folder))
    }

    /// Moves a folder to a new parent
    pub async fn move_folder(&self, id: &str, dto: MoveFolderDto) -> FolderRepositoryResult<FolderDto> {
        let folder = self.folder_repository.move_folder(id, dto.parent_id.as_deref()).await?;
        Ok(FolderDto::from(folder))
    }

    /// Deletes a folder
    pub async fn delete_folder(&self, id: &str) -> FolderRepositoryResult<()> {
        self.folder_repository.delete_folder(id).await
    }
}
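`create_folder` resolves the parent's relative path first and hands it to the repository, which joins the new folder's name onto it. The path arithmetic reduces to something like this sketch (`child_path` is a hypothetical helper name, not a function in the repo):

```rust
use std::path::PathBuf;

/// A child folder's relative path is its parent's path joined with its own
/// name; root-level folders (no parent) use the bare name.
fn child_path(parent_path: Option<&PathBuf>, name: &str) -> PathBuf {
    match parent_path {
        Some(parent) => parent.join(name),
        None => PathBuf::from(name),
    }
}
```

Keeping the paths relative to a per-user root, as the entities document, means the service never needs to know the absolute storage location.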
src/application/services/i18n_application_service.rs (new file, 51 lines)
@@ -0,0 +1,51 @@
use std::sync::Arc;

use crate::domain::services::i18n_service::{I18nService, I18nResult, Locale};

/// Service for i18n operations
pub struct I18nApplicationService {
    i18n_service: Arc<dyn I18nService>,
}

impl I18nApplicationService {
    /// Creates a new i18n application service
    pub fn new(i18n_service: Arc<dyn I18nService>) -> Self {
        Self { i18n_service }
    }

    /// Gets a translation for a key and locale
    pub async fn translate(&self, key: &str, locale: Option<Locale>) -> I18nResult<String> {
        let locale = locale.unwrap_or(Locale::default());
        self.i18n_service.translate(key, locale).await
    }

    /// Loads translations for a locale
    pub async fn load_translations(&self, locale: Locale) -> I18nResult<()> {
        self.i18n_service.load_translations(locale).await
    }

    /// Loads translations for all available locales
    #[allow(dead_code)]
    pub async fn load_all_translations(&self) -> Vec<(Locale, I18nResult<()>)> {
        let locales = self.i18n_service.available_locales().await;
        let mut results = Vec::new();

        for locale in locales {
            let result = self.i18n_service.load_translations(locale).await;
            results.push((locale, result));
        }

        results
    }

    /// Gets available locales
    pub async fn available_locales(&self) -> Vec<Locale> {
        self.i18n_service.available_locales().await
    }

    /// Checks if a locale is supported
    #[allow(dead_code)]
    pub async fn is_supported(&self, locale: Locale) -> bool {
        self.i18n_service.is_supported(locale).await
    }
}
src/application/services/mod.rs (new file, 4 lines)
@@ -0,0 +1,4 @@
pub mod file_service;
pub mod folder_service;
pub mod i18n_application_service;
src/domain/entities/file.rs (new file, 67 lines)
@@ -0,0 +1,67 @@
use std::path::PathBuf;
use serde::{Serialize, Deserialize};

/// Represents a file entity in the domain
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
pub struct File {
    /// Unique identifier for the file
    pub id: String,

    /// Name of the file
    pub name: String,

    /// Path to the file (relative to user's root)
    pub path: PathBuf,

    /// Size of the file in bytes
    pub size: u64,

    /// MIME type of the file
    pub mime_type: String,

    /// Parent folder ID
    pub folder_id: Option<String>,

    /// Creation timestamp
    pub created_at: u64,

    /// Last modification timestamp
    pub modified_at: u64,
}

impl File {
    /// Creates a new file
    pub fn new(
        id: String,
        name: String,
        path: PathBuf,
        size: u64,
        mime_type: String,
        folder_id: Option<String>,
    ) -> Self {
        let now = std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)
            .unwrap()
            .as_secs();

        Self {
            id,
            name,
            path,
            size,
            mime_type,
            folder_id,
            created_at: now,
            modified_at: now,
        }
    }

    /// Updates the file's modification time
    #[allow(dead_code)]
    pub fn touch(&mut self) {
        self.modified_at = std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)
            .unwrap()
            .as_secs();
    }
}
src/domain/entities/folder.rs (new file, 57 lines)
@@ -0,0 +1,57 @@
use std::path::PathBuf;
use serde::{Serialize, Deserialize};

/// Represents a folder entity in the domain
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
pub struct Folder {
    /// Unique identifier for the folder
    pub id: String,

    /// Name of the folder
    pub name: String,

    /// Path to the folder (relative to user's root)
    pub path: PathBuf,

    /// Parent folder ID (None if it's a root folder)
    pub parent_id: Option<String>,

    /// Creation timestamp
    pub created_at: u64,

    /// Last modification timestamp
    pub modified_at: u64,
}

impl Folder {
    /// Creates a new folder
    pub fn new(id: String, name: String, path: PathBuf, parent_id: Option<String>) -> Self {
        let now = std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)
            .unwrap()
            .as_secs();

        Self {
            id,
            name,
            path,
            parent_id,
            created_at: now,
            modified_at: now,
        }
    }

    /// Returns the absolute path of the folder
    #[allow(dead_code)]
    pub fn get_absolute_path(&self, root_path: &PathBuf) -> PathBuf {
        root_path.join(&self.path)
    }

    /// Updates the folder's modification time
    pub fn touch(&mut self) {
        self.modified_at = std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)
            .unwrap()
            .as_secs();
    }
}
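Both entities repeat the same epoch-seconds snippet in `new` and `touch`. It could be factored into a shared helper along these lines (`unix_now` is an assumed name, not a function in the repo):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Current time as whole seconds since the Unix epoch, matching the
/// u64 timestamps stored on the File and Folder entities.
fn unix_now() -> u64 {
    SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock set before the Unix epoch")
        .as_secs()
}
```

`duration_since(UNIX_EPOCH)` only errs if the system clock is set before 1970, which is why the entities get away with a bare `.unwrap()` there.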
src/domain/entities/mod.rs (new file, 3 lines)
@@ -0,0 +1,3 @@
pub mod file;
pub mod folder;
src/domain/mod.rs (new file, 3 lines)
@@ -0,0 +1,3 @@
pub mod entities;
pub mod repositories;
pub mod services;
72
src/domain/repositories/file_repository.rs
Normal file
@@ -0,0 +1,72 @@
use std::path::PathBuf;
use async_trait::async_trait;
use crate::domain::entities::file::File;

/// Error types for file repository operations
#[derive(Debug, thiserror::Error)]
#[allow(dead_code)]
pub enum FileRepositoryError {
    #[error("File not found: {0}")]
    NotFound(String),

    #[error("File already exists: {0}")]
    AlreadyExists(String),

    #[error("Invalid file path: {0}")]
    InvalidPath(String),

    #[error("IO Error: {0}")]
    IoError(#[from] std::io::Error),

    #[error("Other error: {0}")]
    Other(String),
}

/// Result type for file repository operations
pub type FileRepositoryResult<T> = Result<T, FileRepositoryError>;

/// Repository interface for file operations (primary port)
#[async_trait]
pub trait FileRepository: Send + Sync + 'static {
    /// Gets a folder by its ID - helper method for the file repository to work with folders
    #[allow(dead_code)]
    async fn get_folder_by_id(&self, id: &str) -> FileRepositoryResult<crate::domain::entities::folder::Folder>;

    /// Saves a file from bytes
    async fn save_file_from_bytes(
        &self,
        name: String,
        folder_id: Option<String>,
        content_type: String,
        content: Vec<u8>,
    ) -> FileRepositoryResult<File>;

    /// Saves a file with a specific ID
    #[allow(dead_code)]
    async fn save_file_with_id(
        &self,
        id: String,
        name: String,
        folder_id: Option<String>,
        content_type: String,
        content: Vec<u8>,
    ) -> FileRepositoryResult<File>;

    /// Gets a file by its ID
    async fn get_file_by_id(&self, id: &str) -> FileRepositoryResult<File>;

    /// Lists files in a folder
    async fn list_files(&self, folder_id: Option<&str>) -> FileRepositoryResult<Vec<File>>;

    /// Deletes a file
    async fn delete_file(&self, id: &str) -> FileRepositoryResult<()>;

    /// Deletes a file and its entry from the map
    #[allow(dead_code)]
    async fn delete_file_entry(&self, id: &str) -> FileRepositoryResult<()>;

    /// Gets file content as bytes
    async fn get_file_content(&self, id: &str) -> FileRepositoryResult<Vec<u8>>;

    /// Checks if a file exists at the given path
    async fn file_exists(&self, path: &PathBuf) -> FileRepositoryResult<bool>;
}
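The error enum above leans on `thiserror` to derive `Display` from the `#[error(...)]` attributes. A std-only sketch of the same pattern, with `Display` written by hand (the `lookup` helper is hypothetical, for illustration only):

```rust
use std::fmt;

// Mirrors FileRepositoryError; the real code derives this via thiserror.
#[derive(Debug)]
enum FileRepositoryError {
    NotFound(String),
    AlreadyExists(String),
}

impl fmt::Display for FileRepositoryError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Self::NotFound(p) => write!(f, "File not found: {}", p),
            Self::AlreadyExists(p) => write!(f, "File already exists: {}", p),
        }
    }
}

// The alias keeps trait signatures short, exactly as in the port above.
type FileRepositoryResult<T> = Result<T, FileRepositoryError>;

// Hypothetical repository call that always misses.
fn lookup(id: &str) -> FileRepositoryResult<String> {
    Err(FileRepositoryError::NotFound(id.to_string()))
}

fn main() {
    let err = lookup("abc").unwrap_err();
    assert_eq!(err.to_string(), "File not found: abc");
    let _dup = FileRepositoryError::AlreadyExists("abc".into());
}
```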
54
src/domain/repositories/folder_repository.rs
Normal file
@@ -0,0 +1,54 @@
use std::path::PathBuf;
use async_trait::async_trait;
use crate::domain::entities::folder::Folder;

/// Error types for folder repository operations
#[derive(Debug, thiserror::Error)]
#[allow(dead_code)]
pub enum FolderRepositoryError {
    #[error("Folder not found: {0}")]
    NotFound(String),

    #[error("Folder already exists: {0}")]
    AlreadyExists(String),

    #[error("Invalid folder path: {0}")]
    InvalidPath(String),

    #[error("IO Error: {0}")]
    IoError(#[from] std::io::Error),

    #[error("Other error: {0}")]
    Other(String),
}

/// Result type for folder repository operations
pub type FolderRepositoryResult<T> = Result<T, FolderRepositoryError>;

/// Repository interface for folder operations (primary port)
#[async_trait]
pub trait FolderRepository: Send + Sync + 'static {
    /// Creates a new folder
    async fn create_folder(&self, name: String, parent_path: Option<PathBuf>) -> FolderRepositoryResult<Folder>;

    /// Gets a folder by its ID
    async fn get_folder_by_id(&self, id: &str) -> FolderRepositoryResult<Folder>;

    /// Gets a folder by its path
    async fn get_folder_by_path(&self, path: &PathBuf) -> FolderRepositoryResult<Folder>;

    /// Lists folders in a parent folder
    async fn list_folders(&self, parent_id: Option<&str>) -> FolderRepositoryResult<Vec<Folder>>;

    /// Renames a folder
    async fn rename_folder(&self, id: &str, new_name: String) -> FolderRepositoryResult<Folder>;

    /// Moves a folder to a new parent
    async fn move_folder(&self, id: &str, new_parent_id: Option<&str>) -> FolderRepositoryResult<Folder>;

    /// Deletes a folder
    async fn delete_folder(&self, id: &str) -> FolderRepositoryResult<()>;

    /// Checks if a folder exists at the given path
    async fn folder_exists(&self, path: &PathBuf) -> FolderRepositoryResult<bool>;
}
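These traits are the hexagonal-architecture ports that the rest of the codebase consumes as trait objects (e.g. `Arc<dyn FolderRepository>`). A std-only, synchronous mirror of that pattern, assuming a hypothetical `InMemoryRepo` backend (the real trait is async via `async_trait`):

```rust
// Sketch of the "repository as primary port" pattern: the core depends on
// the trait, and concrete backends are swapped in behind a trait object.
trait FolderRepository {
    fn folder_exists(&self, path: &str) -> bool;
}

// Hypothetical in-memory backend, standing in for the filesystem one.
struct InMemoryRepo {
    paths: Vec<String>,
}

impl FolderRepository for InMemoryRepo {
    fn folder_exists(&self, path: &str) -> bool {
        self.paths.iter().any(|p| p.as_str() == path)
    }
}

fn main() {
    // The application core only sees the port, never the backend type.
    let repo: Box<dyn FolderRepository> = Box::new(InMemoryRepo { paths: vec!["docs".into()] });
    assert!(repo.folder_exists("docs"));
    assert!(!repo.folder_exists("photos"));
}
```

The `Send + Sync + 'static` bounds on the real traits are what allow them to be shared across Axum/Tokio tasks inside an `Arc`.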
3
src/domain/repositories/mod.rs
Normal file
@@ -0,0 +1,3 @@
pub mod file_repository;
pub mod folder_repository;

66
src/domain/services/i18n_service.rs
Normal file
@@ -0,0 +1,66 @@
use async_trait::async_trait;
use thiserror::Error;

/// Error types for i18n service operations
#[derive(Debug, Error)]
pub enum I18nError {
    #[error("Translation key not found: {0}")]
    KeyNotFound(String),

    #[error("Invalid locale: {0}")]
    InvalidLocale(String),

    #[error("Error loading translations: {0}")]
    LoadError(String),
}

/// Result type for i18n service operations
pub type I18nResult<T> = Result<T, I18nError>;

/// Supported locales
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum Locale {
    English,
    Spanish,
}

impl Locale {
    /// Converts the locale to its code string
    pub fn as_str(&self) -> &'static str {
        match self {
            Locale::English => "en",
            Locale::Spanish => "es",
        }
    }

    /// Creates a locale from a code string
    pub fn from_str(code: &str) -> Option<Self> {
        match code.to_lowercase().as_str() {
            "en" => Some(Locale::English),
            "es" => Some(Locale::Spanish),
            _ => None,
        }
    }

    /// Gets the default locale
    pub fn default() -> Self {
        Locale::English
    }
}

/// Interface for the i18n service (primary port)
#[async_trait]
pub trait I18nService: Send + Sync + 'static {
    /// Gets a translation for a key and locale
    async fn translate(&self, key: &str, locale: Locale) -> I18nResult<String>;

    /// Loads translations for a locale
    async fn load_translations(&self, locale: Locale) -> I18nResult<()>;

    /// Gets the available locales
    async fn available_locales(&self) -> Vec<Locale>;

    /// Checks if a locale is supported
    #[allow(dead_code)]
    async fn is_supported(&self, locale: Locale) -> bool;
}
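`Locale::from_str` lower-cases the code before matching, so parsing is case-insensitive while `as_str` always returns the canonical lowercase code. A self-contained miniature of that round-trip (the enum here is a local copy for illustration, not the crate's type):

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Locale {
    English,
    Spanish,
}

impl Locale {
    fn as_str(&self) -> &'static str {
        match self {
            Locale::English => "en",
            Locale::Spanish => "es",
        }
    }

    // Case-insensitive: "ES", "Es", and "es" all parse to Spanish.
    fn from_str(code: &str) -> Option<Self> {
        match code.to_lowercase().as_str() {
            "en" => Some(Locale::English),
            "es" => Some(Locale::Spanish),
            _ => None,
        }
    }
}

fn main() {
    assert_eq!(Locale::from_str("ES"), Some(Locale::Spanish));
    assert_eq!(Locale::from_str("es").unwrap().as_str(), "es");
    // Unsupported codes yield None rather than an error.
    assert_eq!(Locale::from_str("fr"), None);
}
```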
1
src/domain/services/mod.rs
Normal file
@@ -0,0 +1 @@
pub mod i18n_service;
3
src/infrastructure/mod.rs
Normal file
@@ -0,0 +1,3 @@
pub mod repositories;
pub mod services;

723
src/infrastructure/repositories/file_fs_repository.rs
Normal file
@@ -0,0 +1,723 @@
use std::path::{Path, PathBuf};
use std::sync::{Arc, Mutex};
use std::collections::HashMap;
use async_trait::async_trait;
use tokio::{fs, io::AsyncWriteExt};
use uuid::Uuid;
use mime_guess::from_path;
use serde::{Serialize, Deserialize};

use crate::domain::entities::file::File;
use crate::domain::repositories::file_repository::{
    FileRepository, FileRepositoryError, FileRepositoryResult
};
use crate::domain::repositories::folder_repository::FolderRepository;

/// Structure to store file IDs mapped to their paths
#[derive(Serialize, Deserialize, Debug, Default)]
struct FileIdMap {
    path_to_id: HashMap<String, String>,
}

/// Filesystem implementation of the FileRepository interface
pub struct FileFsRepository {
    root_path: PathBuf,
    folder_repository: Arc<dyn FolderRepository>,
    id_map: Mutex<FileIdMap>,
}

impl FileFsRepository {
    /// Creates a new filesystem-based file repository
    pub fn new(root_path: PathBuf, folder_repository: Arc<dyn FolderRepository>) -> Self {
        let id_map = Mutex::new(Self::load_id_map(&root_path));
        Self { root_path, folder_repository, id_map }
    }

    /// Loads the file ID map from disk
    fn load_id_map(root_path: &PathBuf) -> FileIdMap {
        let map_path = root_path.join("file_ids.json");

        if map_path.exists() {
            match std::fs::read_to_string(&map_path) {
                Ok(content) => {
                    match serde_json::from_str::<FileIdMap>(&content) {
                        Ok(map) => {
                            tracing::info!("Loaded file ID map with {} entries", map.path_to_id.len());
                            return map;
                        },
                        Err(e) => {
                            tracing::error!("Error parsing file ID map: {}", e);
                        }
                    }
                },
                Err(e) => {
                    tracing::error!("Error reading file ID map: {}", e);
                }
            }
        }

        // Return an empty map if the file doesn't exist or there was an error
        FileIdMap::default()
    }

    /// Saves the file ID map to disk
    fn save_id_map(&self) {
        let map_path = self.root_path.join("file_ids.json");

        let map = self.id_map.lock().unwrap();
        match serde_json::to_string_pretty(&*map) {
            Ok(json) => {
                match std::fs::write(&map_path, json) {
                    Ok(_) => {
                        tracing::info!("Saved file ID map with {} entries", map.path_to_id.len());
                    },
                    Err(e) => {
                        tracing::error!("Error writing file ID map: {}", e);
                    }
                }
            },
            Err(e) => {
                tracing::error!("Error serializing file ID map: {}", e);
            }
        }
    }

    /// Generates a unique ID for a file
    fn generate_id(&self) -> String {
        Uuid::new_v4().to_string()
    }

    /// Gets the ID for a file path or generates a new one
    fn get_or_create_id(&self, path: &Path) -> String {
        let path_str = path.to_string_lossy().to_string();

        let mut map = self.id_map.lock().unwrap();

        // Check if we already have an ID for this path
        if let Some(id) = map.path_to_id.get(&path_str) {
            return id.clone();
        }

        // Generate a new ID
        let id = self.generate_id();
        map.path_to_id.insert(path_str, id.clone());

        // No need to save immediately - we'll save when file operations complete

        id
    }

    /// Resolves a relative path to an absolute path
    fn resolve_path(&self, relative_path: &Path) -> PathBuf {
        self.root_path.join(relative_path)
    }
}

#[async_trait]
impl FileRepository for FileFsRepository {
    async fn get_folder_by_id(&self, id: &str) -> FileRepositoryResult<crate::domain::entities::folder::Folder> {
        match self.folder_repository.get_folder_by_id(id).await {
            Ok(folder) => Ok(folder),
            Err(e) => Err(FileRepositoryError::Other(format!("Folder not found: {}", e))),
        }
    }

    async fn save_file_from_bytes(
        &self,
        name: String,
        folder_id: Option<String>,
        content_type: String,
        content: Vec<u8>,
    ) -> FileRepositoryResult<File> {
        // Get the folder path
        let folder_path = match &folder_id {
            Some(id) => {
                match self.folder_repository.get_folder_by_id(id).await {
                    Ok(folder) => {
                        tracing::info!("Using folder path: {:?} for folder_id: {:?}", folder.path, id);
                        folder.path
                    },
                    Err(e) => {
                        tracing::error!("Error getting folder: {}", e);
                        PathBuf::new()
                    },
                }
            },
            None => PathBuf::new(),
        };

        // Create the file path
        let file_path = if folder_path.as_os_str().is_empty() {
            PathBuf::from(&name)
        } else {
            folder_path.join(&name)
        };
        tracing::info!("Created file path: {:?}", file_path);

        // Check if the file already exists
        let exists = self.file_exists(&file_path).await?;
        tracing::info!("File exists check: {} for path: {:?}", exists, file_path);

        if exists {
            tracing::warn!("File already exists at path: {:?}", file_path);
            return Err(FileRepositoryError::AlreadyExists(file_path.to_string_lossy().to_string()));
        }

        // Create parent directories if they don't exist
        let abs_path = self.resolve_path(&file_path);
        if let Some(parent) = abs_path.parent() {
            fs::create_dir_all(parent).await
                .map_err(FileRepositoryError::IoError)?;
        }

        // Write the file
        let mut file = fs::File::create(&abs_path).await
            .map_err(FileRepositoryError::IoError)?;

        file.write_all(&content).await
            .map_err(FileRepositoryError::IoError)?;

        // Get file metadata
        let metadata = fs::metadata(&abs_path).await
            .map_err(FileRepositoryError::IoError)?;

        // Determine the MIME type
        let mime_type = if content_type.is_empty() {
            from_path(&file_path)
                .first_or_octet_stream()
                .to_string()
        } else {
            content_type
        };

        // Create and return the file entity with a persistent ID
        let id = self.get_or_create_id(&file_path);
        let file = File::new(
            id,
            name,
            file_path.clone(),
            metadata.len(),
            mime_type,
            folder_id,
        );

        // Save the ID map
        self.save_id_map();

        tracing::info!("Saved file: {} with ID: {}", file_path.display(), file.id);
        Ok(file)
    }

    async fn save_file_with_id(
        &self,
        id: String,
        name: String,
        folder_id: Option<String>,
        content_type: String,
        content: Vec<u8>,
    ) -> FileRepositoryResult<File> {
        // Get the folder path
        let folder_path = match &folder_id {
            Some(fid) => {
                match self.folder_repository.get_folder_by_id(fid).await {
                    Ok(folder) => {
                        tracing::info!("Using folder path: {:?} for folder_id: {:?}", folder.path, fid);
                        folder.path
                    },
                    Err(e) => {
                        tracing::error!("Error getting folder: {}", e);
                        PathBuf::new()
                    },
                }
            },
            None => PathBuf::new(),
        };

        // Create the file path
        let file_path = if folder_path.as_os_str().is_empty() {
            PathBuf::from(&name)
        } else {
            folder_path.join(&name)
        };
        tracing::info!("Created file path with ID: {:?} for file: {}", file_path, id);

        // Check if a file already exists (different from the one we're moving)
        let exists = self.file_exists(&file_path).await?;
        tracing::info!("File exists check: {} for path: {:?}", exists, file_path);

        // For save_file_with_id, we'll force overwrite if needed
        if exists {
            tracing::warn!("File already exists at path: {:?} - will overwrite", file_path);
            // Delete the existing file
            let abs_path = self.resolve_path(&file_path);
            if let Err(e) = fs::remove_file(&abs_path).await {
                tracing::error!("Failed to delete existing file: {} - {}", abs_path.display(), e);
                return Err(FileRepositoryError::IoError(e));
            }
        }

        // Create parent directories if they don't exist
        let abs_path = self.resolve_path(&file_path);
        if let Some(parent) = abs_path.parent() {
            fs::create_dir_all(parent).await
                .map_err(FileRepositoryError::IoError)?;
        }

        // Write the file
        let mut file = fs::File::create(&abs_path).await
            .map_err(FileRepositoryError::IoError)?;

        file.write_all(&content).await
            .map_err(FileRepositoryError::IoError)?;

        // Get file metadata
        let metadata = fs::metadata(&abs_path).await
            .map_err(FileRepositoryError::IoError)?;

        // Determine the MIME type
        let mime_type = if content_type.is_empty() {
            from_path(&file_path)
                .first_or_octet_stream()
                .to_string()
        } else {
            content_type
        };

        // Create the file entity with the provided ID
        let file_entity = File::new(
            id.clone(),
            name,
            file_path.clone(),
            metadata.len(),
            mime_type,
            folder_id,
        );

        // Update the ID map
        {
            let mut map = self.id_map.lock().unwrap();

            // First, remove any existing entries for this ID
            let entries_to_remove: Vec<String> = map.path_to_id.iter()
                .filter(|(_, v)| **v == id)
                .map(|(k, _)| k.clone())
                .collect();

            for key in entries_to_remove {
                tracing::info!("Removing old mapping: {} -> {}", key, id);
                map.path_to_id.remove(&key);
            }

            // Then add the new entry
            let path_str = file_path.to_string_lossy().to_string();
            tracing::info!("Adding new mapping: {} -> {}", path_str, id);
            map.path_to_id.insert(path_str, id);
        }

        // Save the ID map
        self.save_id_map();

        tracing::info!("Saved file with specific ID: {} at path: {}", file_entity.id, file_path.display());
        Ok(file_entity)
    }

    async fn get_file_by_id(&self, id: &str) -> FileRepositoryResult<File> {
        // Find the path by ID in the map
        let path_str = {
            let map = self.id_map.lock().unwrap();
            match map.path_to_id.iter().find(|(_, v)| v == &id) {
                Some((path, _)) => path.clone(),
                None => {
                    tracing::error!("No file found with ID: {}", id);
                    return Err(FileRepositoryError::NotFound(id.to_string()));
                }
            }
        };

        // Convert the path string to a PathBuf
        let file_path = PathBuf::from(path_str);

        // Check if the file exists
        let abs_path = self.resolve_path(&file_path);
        if !abs_path.exists() || !abs_path.is_file() {
            tracing::error!("File not found at path: {}", file_path.display());
            return Err(FileRepositoryError::NotFound(format!("File {} not found at {}", id, file_path.display())));
        }

        // Get file metadata
        let metadata = fs::metadata(&abs_path).await
            .map_err(|e| {
                tracing::error!("Error getting metadata: {}", e);
                FileRepositoryError::IoError(e)
            })?;

        // Get the file name
        let name = match file_path.file_name() {
            Some(os_str) => os_str.to_string_lossy().to_string(),
            None => {
                tracing::error!("Invalid file path: {}", file_path.display());
                return Err(FileRepositoryError::InvalidPath(file_path.to_string_lossy().to_string()));
            }
        };

        // Determine the parent folder ID
        let parent_dir = file_path.parent().unwrap_or(Path::new(""));
        let folder_id = if parent_dir.as_os_str().is_empty() {
            None
        } else {
            let parent_path_buf = PathBuf::from(parent_dir);
            match self.folder_repository.get_folder_by_path(&parent_path_buf).await {
                Ok(folder) => Some(folder.id),
                Err(_) => None,
            }
        };

        // Determine the MIME type
        let mime_type = from_path(&file_path)
            .first_or_octet_stream()
            .to_string();

        // Create the file entity
        let mut file = File::new(
            id.to_string(),
            name,
            file_path,
            metadata.len(),
            mime_type,
            folder_id,
        );

        // Set timestamps if available
        if let Ok(created) = metadata.created() {
            if let Ok(since_epoch) = created.duration_since(std::time::UNIX_EPOCH) {
                file.created_at = since_epoch.as_secs();
            }
        }

        if let Ok(modified) = metadata.modified() {
            if let Ok(since_epoch) = modified.duration_since(std::time::UNIX_EPOCH) {
                file.modified_at = since_epoch.as_secs();
            }
        }

        Ok(file)
    }

    async fn list_files(&self, folder_id: Option<&str>) -> FileRepositoryResult<Vec<File>> {
        let mut files = Vec::new();

        tracing::info!("Listing files in folder_id: {:?}", folder_id);

        // Get the folder path
        let folder_path = match folder_id {
            Some(id) => {
                match self.folder_repository.get_folder_by_id(id).await {
                    Ok(folder) => {
                        tracing::info!("Found folder with path: {:?}", folder.path);
                        folder.path
                    },
                    Err(e) => {
                        tracing::error!("Error getting folder by ID: {}: {}", id, e);
                        return Ok(Vec::new());
                    },
                }
            },
            None => PathBuf::new(),
        };

        // Get the absolute folder path
        let abs_folder_path = self.resolve_path(&folder_path);
        tracing::info!("Absolute folder path: {:?}", abs_folder_path);

        // Ensure the directory exists
        if !abs_folder_path.exists() || !abs_folder_path.is_dir() {
            tracing::error!("Directory does not exist or is not a directory: {:?}", abs_folder_path);
            return Ok(Vec::new());
        }

        tracing::info!("Directory exists, reading contents");

        // Alternative approach - check files in the map that belong to this folder_id
        let file_candidates: Vec<(String, String)> = {
            let map = self.id_map.lock().unwrap();
            tracing::info!("Checking map with {} entries for files in folder_id: {:?}", map.path_to_id.len(), folder_id);

            // Filter by folder path prefix and collect the filtered entries
            let candidates = map.path_to_id.iter()
                .filter(|(path_str, _)| {
                    let path = PathBuf::from(path_str);

                    // Check if this file belongs to the requested folder
                    match &folder_id {
                        Some(_) => {
                            // For a specific folder, check if the file's parent is the folder path
                            let parent_path = path.parent().unwrap_or_else(|| Path::new(""));
                            parent_path == folder_path
                        },
                        None => {
                            // For the root folder, check if the file is directly in root (no parent or empty parent)
                            let parent = path.parent().unwrap_or_else(|| Path::new(""));
                            parent.as_os_str().is_empty() || parent == Path::new(".")
                        }
                    }
                })
                .map(|(path, id)| (path.clone(), id.clone()))
                .collect();

            candidates
        };

        // Process candidates after releasing the mutex lock
        for (path_str, file_id) in file_candidates {
            let path = PathBuf::from(&path_str);
            tracing::info!("Found file in target folder: {} with ID: {}", path_str, file_id);

            // Verify the file exists physically
            let abs_path = self.resolve_path(&path);
            if !abs_path.exists() || !abs_path.is_file() {
                tracing::warn!("File in map doesn't exist physically: {} (ID: {})", path_str, file_id);
                continue;
            }

            // Get file info
            match fs::metadata(&abs_path).await {
                Ok(metadata) => {
                    let file_name = path.file_name()
                        .map(|os_str| os_str.to_string_lossy().to_string())
                        .unwrap_or_else(|| "unnamed".to_string());

                    // Determine the MIME type
                    let mime_type = from_path(&path)
                        .first_or_octet_stream()
                        .to_string();

                    // Get timestamps
                    let created_at = metadata.created()
                        .map(|time| time.duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())
                        .unwrap_or_else(|_| 0);

                    let modified_at = metadata.modified()
                        .map(|time| time.duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())
                        .unwrap_or_else(|_| 0);

                    let mut file = File::new(
                        file_id.clone(),
                        file_name.clone(),
                        path.clone(),
                        metadata.len(),
                        mime_type,
                        folder_id.map(String::from),
                    );

                    file.created_at = created_at;
                    file.modified_at = modified_at;

                    tracing::info!("Adding file to result list: {} (path: {:?})", file.name, path);
                    files.push(file);
                },
                Err(e) => {
                    tracing::warn!("Failed to get metadata for file: {} - {}", path_str, e);
                }
            }
        }

        if !files.is_empty() {
            tracing::info!("Found {} files in folder {:?} from map", files.len(), folder_id);
            return Ok(files);
        }

        // If we didn't find files in the map or the list is empty, fall back to a directory scan
        tracing::info!("Scanning directory for files: {:?}", abs_folder_path);

        // Read directory entries
        let mut entries = fs::read_dir(abs_folder_path).await
            .map_err(FileRepositoryError::IoError)?;

        while let Some(entry) = entries.next_entry().await
            .map_err(FileRepositoryError::IoError)? {

            let path = entry.path();
            tracing::info!("Found entry: {:?}", path);

            let metadata = entry.metadata().await
                .map_err(FileRepositoryError::IoError)?;

            // Only include files, not directories
            if metadata.is_file() {
                let file_name = entry.file_name().to_string_lossy().to_string();
                tracing::info!("Found file: {}", file_name);

                let file_path = if folder_path.as_os_str().is_empty() {
                    PathBuf::from(&file_name)
                } else {
                    folder_path.join(&file_name)
                };

                tracing::info!("File path (relative to root): {:?}", file_path);

                // Determine the MIME type
                let mime_type = from_path(&file_path)
                    .first_or_octet_stream()
                    .to_string();

                // Get timestamps
                let created_at = metadata.created()
                    .map(|time| time.duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())
                    .unwrap_or_else(|_| 0);

                let modified_at = metadata.modified()
                    .map(|time| time.duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())
                    .unwrap_or_else(|_| 0);

                // Check if this file is already in our list (can happen on case-insensitive filesystems)
                let duplicate = files.iter().any(|f: &File| f.name.to_lowercase() == file_name.to_lowercase());
                if duplicate {
                    tracing::warn!("Skipping duplicate file with name: {} (case-insensitive match)", file_name);
                    continue;
                }

                // Create the file entity with a persistent ID
                let id = self.get_or_create_id(&file_path);
                tracing::info!("Using ID for file: {} for path: {:?}", id, file_path);

                // Check if the file is a PDF - just for debugging
                if file_name.to_lowercase().ends_with(".pdf") {
                    tracing::info!("PDF file detected: {} with ID: {}", file_name, id);
                }

                let mut file = File::new(
                    id,
                    file_name,
                    file_path.clone(),
                    metadata.len(),
                    mime_type,
                    folder_id.map(String::from),
                );

                file.created_at = created_at;
                file.modified_at = modified_at;

                tracing::info!("Adding file to result list: {} (path: {:?})", file.name, file_path);
                files.push(file);
            } else {
                tracing::info!("Skipping directory: {:?}", path);
            }
        }

        tracing::info!("Found {} files in folder {:?}", files.len(), folder_id);

        // Log the map contents for debugging
        {
            let map = self.id_map.lock().unwrap();
            tracing::info!("ID map has {} entries", map.path_to_id.len());
            for (path, id) in &map.path_to_id {
                tracing::info!("Map entry: {} -> {}", path, id);
            }
        }

        Ok(files)
    }

    async fn delete_file(&self, id: &str) -> FileRepositoryResult<()> {
        let file = self.get_file_by_id(id).await?;

        // Delete the physical file
        let abs_path = self.resolve_path(&file.path);
        tracing::info!("Deleting physical file: {}", abs_path.display());

        fs::remove_file(abs_path).await
            .map_err(FileRepositoryError::IoError)?;

        tracing::info!("Physical file deleted successfully: {}", file.path.display());
        Ok(())
    }

    async fn delete_file_entry(&self, id: &str) -> FileRepositoryResult<()> {
        let file = self.get_file_by_id(id).await?;

        // Delete the physical file
        let abs_path = self.resolve_path(&file.path);
        tracing::info!("Deleting physical file and entry for ID: {}", id);

        // Try to delete the file, but continue even if it fails
        let delete_result = fs::remove_file(&abs_path).await;
        match &delete_result {
            Ok(_) => tracing::info!("Physical file deleted successfully: {}", file.path.display()),
            Err(e) => tracing::warn!("Failed to delete physical file: {} - {}", file.path.display(), e),
        };

        // Remove all entries for this ID from the map
        {
            let mut map = self.id_map.lock().unwrap();

            // Find all paths that map to this ID
            let paths_to_remove: Vec<String> = map.path_to_id.iter()
                .filter(|(_, v)| **v == id)
                .map(|(k, _)| k.clone())
                .collect();

            // Remove each path
            for path in &paths_to_remove {
                tracing::info!("Removing map entry: {} -> {}", path, id);
                map.path_to_id.remove(path);
            }

            tracing::info!("Removed {} map entries for ID: {}", paths_to_remove.len(), id);
        }

        // Save the updated map
        self.save_id_map();

        // Always return Ok: the map entry has been removed, and the operation
        // should continue even if deleting the physical file failed.
        Ok(())
    }

    async fn get_file_content(&self, id: &str) -> FileRepositoryResult<Vec<u8>> {
        let file = self.get_file_by_id(id).await?;

        // Read the file content
        let abs_path = self.resolve_path(&file.path);
        let content = fs::read(abs_path).await
            .map_err(FileRepositoryError::IoError)?;

        Ok(content)
    }

    async fn file_exists(&self, path: &PathBuf) -> FileRepositoryResult<bool> {
        let abs_path = self.resolve_path(path);

        // Check if the file exists and is a file (not a directory)
        let exists = abs_path.exists() && abs_path.is_file();

        tracing::info!("Checking if file exists: {} - path: {}", exists, abs_path.display());

        // If it exists, try to get metadata to verify it's accessible
        if exists {
            match fs::metadata(&abs_path).await {
                Ok(metadata) => {
                    if metadata.is_file() {
                        tracing::info!("File exists and is accessible: {}", abs_path.display());
                        return Ok(true);
                    } else {
                        tracing::warn!("Path exists but is not a file: {}", abs_path.display());
                        return Ok(false);
                    }
                },
                Err(e) => {
                    tracing::warn!("File exists but metadata check failed: {} - {}", abs_path.display(), e);
                    return Ok(false);
                }
            }
        }

        Ok(false)
    }
}
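The persistent-ID trick in this repository is a mutex-guarded `HashMap` keyed by relative path, persisted as JSON. A std-only sketch of the `get_or_create_id` idea, with a counter standing in for `Uuid::new_v4` (which comes from the external `uuid` crate); the `IdMap` type here is hypothetical:

```rust
use std::collections::HashMap;
use std::sync::Mutex;

struct IdMap {
    path_to_id: Mutex<HashMap<String, String>>,
    next: Mutex<u64>, // stand-in for UUID generation
}

impl IdMap {
    fn new() -> Self {
        Self { path_to_id: Mutex::new(HashMap::new()), next: Mutex::new(0) }
    }

    /// Returns the existing ID for `path`, or mints and records a new one,
    /// so the same path always resolves to the same ID across calls.
    fn get_or_create_id(&self, path: &str) -> String {
        let mut map = self.path_to_id.lock().unwrap();
        if let Some(id) = map.get(path) {
            return id.clone();
        }
        let mut n = self.next.lock().unwrap();
        *n += 1;
        let id = format!("id-{}", *n);
        map.insert(path.to_string(), id.clone());
        id
    }
}

fn main() {
    let ids = IdMap::new();
    let a = ids.get_or_create_id("docs/report.pdf");
    let b = ids.get_or_create_id("docs/report.pdf"); // same path -> same ID
    assert_eq!(a, b);
    let c = ids.get_or_create_id("docs/other.pdf");
    assert_ne!(a, c);
}
```

As in the real repository, the map is only flushed to disk (`save_id_map`) after a batch of operations, not on every insertion.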
399
src/infrastructure/repositories/folder_fs_repository.rs
Normal file
399
src/infrastructure/repositories/folder_fs_repository.rs
Normal file
@@ -0,0 +1,399 @@
use std::path::{Path, PathBuf};
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use async_trait::async_trait;
use tokio::fs;
use uuid::Uuid;
use serde::{Serialize, Deserialize};

use crate::domain::entities::folder::Folder;
use crate::domain::repositories::folder_repository::{
    FolderRepository, FolderRepositoryError, FolderRepositoryResult
};

/// Stores the mapping between folder paths and their IDs
#[derive(Debug, Serialize, Deserialize, Default)]
struct FolderIdMap {
    path_to_id: HashMap<String, String>,
}

/// Filesystem implementation of the FolderRepository interface
pub struct FolderFsRepository {
    root_path: PathBuf,
    id_map: Arc<Mutex<FolderIdMap>>,
}

impl FolderFsRepository {
    /// Creates a new filesystem-based folder repository
    pub fn new(root_path: PathBuf) -> Self {
        // Create or load the ID mapping
        let id_map = Arc::new(Mutex::new(FolderIdMap::default()));

        // Try to load an existing mapping if one is on disk
        let map_path = root_path.join("folder_ids.json");
        if let Ok(contents) = std::fs::read_to_string(map_path) {
            if let Ok(loaded_map) = serde_json::from_str::<FolderIdMap>(&contents) {
                let mut map = id_map.lock().unwrap();
                *map = loaded_map;
                tracing::info!("Loaded folder ID map with {} entries", map.path_to_id.len());
            }
        }

        Self { root_path, id_map }
    }

    /// Persists the ID mapping to disk
    fn save_id_map(&self) {
        let map_path = self.root_path.join("folder_ids.json");
        let map = self.id_map.lock().unwrap();
        if let Ok(json) = serde_json::to_string_pretty(&*map) {
            std::fs::write(map_path, json).ok();
        }
    }

    /// Gets the existing ID for a path, or generates and stores a new one
    fn get_or_create_id(&self, path: &Path) -> String {
        let path_str = path.to_string_lossy().to_string();
        let mut map = self.id_map.lock().unwrap();

        if let Some(id) = map.path_to_id.get(&path_str) {
            return id.clone();
        }

        // No entry yet: generate a new ID
        let id = Uuid::new_v4().to_string();
        map.path_to_id.insert(path_str, id.clone());

        // Persist the updated map
        drop(map); // Release the mutex before saving
        self.save_id_map();

        id
    }

    /// Generates a unique ID for a folder
    #[allow(dead_code)]
    fn generate_id(&self) -> String {
        Uuid::new_v4().to_string()
    }

    /// Resolves a relative path to an absolute path
    fn resolve_path(&self, relative_path: &Path) -> PathBuf {
        self.root_path.join(relative_path)
    }

    /// Creates the physical directory on the filesystem
    async fn create_directory(&self, path: &Path) -> Result<(), std::io::Error> {
        fs::create_dir_all(path).await
    }
}

#[async_trait]
impl FolderRepository for FolderFsRepository {
    async fn create_folder(&self, name: String, parent_path: Option<PathBuf>) -> FolderRepositoryResult<Folder> {
        // Calculate the new folder path
        let path = match &parent_path {
            Some(parent) => parent.join(&name),
            None => PathBuf::from(&name),
        };

        // Check whether the folder already exists
        if self.folder_exists(&path).await? {
            return Err(FolderRepositoryError::AlreadyExists(path.to_string_lossy().to_string()));
        }

        // Create the physical directory
        let abs_path = self.resolve_path(&path);
        self.create_directory(&abs_path).await
            .map_err(FolderRepositoryError::IoError)?;

        // Determine the parent ID, if any
        let parent_id = if let Some(parent) = &parent_path {
            if !parent.as_os_str().is_empty() {
                let parent_folder = self.get_folder_by_path(parent).await?;
                Some(parent_folder.id)
            } else {
                None
            }
        } else {
            None
        };

        // Create and return the folder entity with a persisted ID
        let id = self.get_or_create_id(&path);
        let folder = Folder::new(id, name, path, parent_id);

        tracing::debug!("Created folder with ID: {}", folder.id);
        Ok(folder)
    }
    async fn get_folder_by_id(&self, id: &str) -> FolderRepositoryResult<Folder> {
        tracing::debug!("Looking up folder with ID: {}", id);

        // First try to find the path associated with this ID
        let path_opt = {
            let map = self.id_map.lock().unwrap();
            // Invert the mapping to search by ID
            map.path_to_id.iter()
                .find_map(|(path, folder_id)| if folder_id == id { Some(path.clone()) } else { None })
        };

        if let Some(path_str) = path_opt {
            let path = PathBuf::from(path_str);
            tracing::debug!("Found path for ID {}: {:?}", id, path);
            return self.get_folder_by_path(&path).await;
        }

        // Fallback: scan all folders
        tracing::debug!("ID {} not found in the map; scanning all folders", id);
        let all_folders = self.list_folders(None).await?;

        // Log the available IDs for debugging
        for folder in &all_folders {
            tracing::debug!("Available folder - ID: {}, Name: {}", folder.id, folder.name);
        }

        // Find the folder with the matching ID
        all_folders.into_iter()
            .find(|folder| folder.id == id)
            .ok_or_else(|| FolderRepositoryError::NotFound(id.to_string()))
    }
    async fn get_folder_by_path(&self, path: &PathBuf) -> FolderRepositoryResult<Folder> {
        // Check that the physical directory exists
        let abs_path = self.resolve_path(path);
        if !abs_path.exists() || !abs_path.is_dir() {
            return Err(FolderRepositoryError::NotFound(path.to_string_lossy().to_string()));
        }

        // Extract the folder name and parent path
        let name = path.file_name()
            .ok_or_else(|| FolderRepositoryError::InvalidPath(path.to_string_lossy().to_string()))?
            .to_string_lossy()
            .to_string();

        let parent_path = path.parent().map(|p| p.to_path_buf());

        // Determine the parent ID, if any
        let parent_id = if let Some(parent) = &parent_path {
            if !parent.as_os_str().is_empty() {
                match self.get_folder_by_path(parent).await {
                    Ok(parent_folder) => Some(parent_folder.id),
                    Err(_) => None,
                }
            } else {
                None
            }
        } else {
            None
        };

        // Get a consistent ID for this path
        let id = self.get_or_create_id(path);
        tracing::debug!("Found folder with path: {:?}, assigned ID: {}", path, id);

        // Read folder metadata for the timestamps
        let metadata = fs::metadata(&abs_path).await
            .map_err(FolderRepositoryError::IoError)?;

        let created_at = metadata.created()
            .map(|time| time.duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())
            .unwrap_or(0);

        let modified_at = metadata.modified()
            .map(|time| time.duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())
            .unwrap_or(0);

        // Create and return the folder entity
        let mut folder = Folder::new(id, name, path.clone(), parent_id);
        folder.created_at = created_at;
        folder.modified_at = modified_at;

        Ok(folder)
    }
    async fn list_folders(&self, parent_id: Option<&str>) -> FolderRepositoryResult<Vec<Folder>> {
        let parent_path = match parent_id {
            Some(id) => {
                let parent = self.get_folder_by_id(id).await?;
                parent.path
            },
            None => PathBuf::from(""),
        };

        let abs_parent_path = self.resolve_path(&parent_path);
        let mut folders = Vec::new();

        // Read the directory entries
        let mut entries = fs::read_dir(abs_parent_path).await
            .map_err(FolderRepositoryError::IoError)?;

        while let Some(entry) = entries.next_entry().await
            .map_err(FolderRepositoryError::IoError)? {

            let metadata = entry.metadata().await
                .map_err(FolderRepositoryError::IoError)?;

            // Only include directories
            if metadata.is_dir() {
                let path = if parent_path.as_os_str().is_empty() {
                    PathBuf::from(entry.file_name())
                } else {
                    parent_path.join(entry.file_name())
                };

                match self.get_folder_by_path(&path).await {
                    Ok(folder) => folders.push(folder),
                    Err(_) => continue,
                }
            }
        }

        Ok(folders)
    }
    async fn rename_folder(&self, id: &str, new_name: String) -> FolderRepositoryResult<Folder> {
        let folder = self.get_folder_by_id(id).await?;
        tracing::debug!("Renaming folder with ID: {}, Name: {}", id, folder.name);

        // Calculate the new path
        let parent_path = folder.path.parent()
            .map(|p| p.to_path_buf())
            .unwrap_or_else(|| PathBuf::from(""));

        let new_path = if parent_path.as_os_str().is_empty() {
            PathBuf::from(&new_name)
        } else {
            parent_path.join(&new_name)
        };

        // Check whether the target already exists
        if self.folder_exists(&new_path).await? {
            return Err(FolderRepositoryError::AlreadyExists(new_path.to_string_lossy().to_string()));
        }

        // Rename the physical directory
        let abs_old_path = self.resolve_path(&folder.path);
        let abs_new_path = self.resolve_path(&new_path);

        fs::rename(&abs_old_path, &abs_new_path).await
            .map_err(FolderRepositoryError::IoError)?;

        // Update the ID map: drop the old entry and add the new one
        let path_str = new_path.to_string_lossy().to_string();
        {
            let mut map = self.id_map.lock().unwrap();
            let old_path_str = folder.path.to_string_lossy().to_string();
            map.path_to_id.remove(&old_path_str);
            map.path_to_id.insert(path_str.clone(), id.to_string());
        }

        // Persist the updated map
        self.save_id_map();

        // Create and return the updated folder entity
        let mut updated_folder = Folder::new(
            folder.id.clone(),
            new_name,
            new_path.clone(),
            folder.parent_id.clone(),
        );
        updated_folder.created_at = folder.created_at;
        updated_folder.touch();

        tracing::debug!("Folder renamed successfully: ID={}, New name={}", id, updated_folder.name);
        Ok(updated_folder)
    }
    async fn move_folder(&self, id: &str, new_parent_id: Option<&str>) -> FolderRepositoryResult<Folder> {
        let folder = self.get_folder_by_id(id).await?;
        tracing::debug!("Moving folder with ID: {}, Name: {}", id, folder.name);

        // Get the new parent path
        let new_parent_path = match new_parent_id {
            Some(parent_id) => {
                let parent = self.get_folder_by_id(parent_id).await?;
                parent.path
            },
            None => PathBuf::from(""),
        };

        // Calculate the new path
        let new_path = if new_parent_path.as_os_str().is_empty() {
            PathBuf::from(&folder.name)
        } else {
            new_parent_path.join(&folder.name)
        };

        // Check whether the target already exists
        if self.folder_exists(&new_path).await? {
            return Err(FolderRepositoryError::AlreadyExists(new_path.to_string_lossy().to_string()));
        }

        // Move the physical directory
        let abs_old_path = self.resolve_path(&folder.path);
        let abs_new_path = self.resolve_path(&new_path);

        fs::rename(&abs_old_path, &abs_new_path).await
            .map_err(FolderRepositoryError::IoError)?;

        // Update the ID map: drop the old entry and add the new one
        let path_str = new_path.to_string_lossy().to_string();
        {
            let mut map = self.id_map.lock().unwrap();
            let old_path_str = folder.path.to_string_lossy().to_string();
            map.path_to_id.remove(&old_path_str);
            map.path_to_id.insert(path_str.clone(), id.to_string());
        }

        // Persist the updated map
        self.save_id_map();

        // Create and return the updated folder entity
        let new_parent_id = new_parent_id.map(|parent_id| parent_id.to_string());

        let mut updated_folder = Folder::new(
            folder.id.clone(),
            folder.name.clone(),
            new_path.clone(),
            new_parent_id,
        );
        updated_folder.created_at = folder.created_at;
        updated_folder.touch();

        tracing::debug!("Folder moved successfully: ID={}, New path={:?}", id, new_path);
        Ok(updated_folder)
    }
    async fn delete_folder(&self, id: &str) -> FolderRepositoryResult<()> {
        let folder = self.get_folder_by_id(id).await?;
        tracing::debug!("Deleting folder with ID: {}, Name: {}", id, folder.name);

        // Delete the physical directory
        let abs_path = self.resolve_path(&folder.path);
        fs::remove_dir_all(abs_path).await
            .map_err(FolderRepositoryError::IoError)?;

        // Update the ID map: remove the entry
        {
            let mut map = self.id_map.lock().unwrap();
            let path_str = folder.path.to_string_lossy().to_string();
            map.path_to_id.remove(&path_str);
        }

        // Persist the updated map
        self.save_id_map();

        tracing::debug!("Folder deleted successfully: ID={}", id);
        Ok(())
    }
    async fn folder_exists(&self, path: &PathBuf) -> FolderRepositoryResult<bool> {
        let abs_path = self.resolve_path(path);
        Ok(abs_path.exists() && abs_path.is_dir())
    }
}
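The repository above keeps folder identity stable by persisting a path-to-ID map and doing a reverse scan for ID lookups (see `get_or_create_id` and `get_folder_by_id`). A minimal in-memory sketch of that pattern, with two labeled simplifications: a counter replaces `Uuid::new_v4()` and nothing is written to `folder_ids.json`:

```rust
use std::collections::HashMap;

/// Simplified stand-in for the repository's FolderIdMap.
struct FolderIdMap {
    path_to_id: HashMap<String, String>,
    next: u64, // counter instead of a UUID generator (simplification)
}

impl FolderIdMap {
    fn new() -> Self {
        Self { path_to_id: HashMap::new(), next: 0 }
    }

    /// Returns the stored ID for a path, generating one on first use,
    /// so the same folder always resolves to the same ID.
    fn get_or_create_id(&mut self, path: &str) -> String {
        if let Some(id) = self.path_to_id.get(path) {
            return id.clone();
        }
        self.next += 1;
        let id = format!("folder-{}", self.next);
        self.path_to_id.insert(path.to_string(), id.clone());
        id
    }

    /// Reverse lookup by ID, as done in get_folder_by_id: O(n) over the map.
    fn path_for_id(&self, id: &str) -> Option<&str> {
        self.path_to_id
            .iter()
            .find_map(|(p, i)| if i == id { Some(p.as_str()) } else { None })
    }
}

fn main() {
    let mut map = FolderIdMap::new();
    let a = map.get_or_create_id("docs/reports");
    let b = map.get_or_create_id("docs/reports");
    assert_eq!(a, b); // stable across repeated lookups
    assert_eq!(map.path_for_id(&a), Some("docs/reports"));
    assert_eq!(map.path_for_id("missing"), None);
}
```

The linear reverse scan is fine for small maps; a second `id_to_path` index would make `get_folder_by_id` O(1) at the cost of keeping two maps in sync.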
src/infrastructure/repositories/mod.rs (new file, 3 lines)
@@ -0,0 +1,3 @@
pub mod file_fs_repository;
pub mod folder_fs_repository;
src/infrastructure/services/file_system_i18n_service.rs (new file, 141 lines)
@@ -0,0 +1,141 @@
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::RwLock;
use async_trait::async_trait;
use serde_json::Value;
use tokio::fs;

use crate::domain::services::i18n_service::{I18nService, I18nError, I18nResult, Locale};

/// File system implementation of the I18nService
pub struct FileSystemI18nService {
    /// Base directory containing translation files
    translations_dir: PathBuf,

    /// Cached translations (locale code -> JSON data)
    cache: RwLock<HashMap<Locale, Value>>,
}

impl FileSystemI18nService {
    /// Creates a new file system i18n service
    pub fn new(translations_dir: PathBuf) -> Self {
        Self {
            translations_dir,
            cache: RwLock::new(HashMap::new()),
        }
    }

    /// Gets the translation file path for a locale
    fn get_locale_file_path(&self, locale: Locale) -> PathBuf {
        self.translations_dir.join(format!("{}.json", locale.as_str()))
    }

    /// Gets a nested key (e.g. "files.upload") from JSON data
    fn get_nested_value(&self, data: &Value, key: &str) -> Option<String> {
        let parts: Vec<&str> = key.split('.').collect();
        let mut current = data;

        for part in &parts[0..parts.len() - 1] {
            if let Some(next) = current.get(part) {
                current = next;
            } else {
                return None;
            }
        }

        if let Some(last_part) = parts.last() {
            if let Some(value) = current.get(last_part) {
                if value.is_string() {
                    return value.as_str().map(|s| s.to_string());
                }
            }
        }

        None
    }
}
#[async_trait]
impl I18nService for FileSystemI18nService {
    async fn translate(&self, key: &str, locale: Locale) -> I18nResult<String> {
        // Check whether the translations are already cached
        {
            let cache = self.cache.read().unwrap();
            if let Some(translations) = cache.get(&locale) {
                if let Some(value) = self.get_nested_value(translations, key) {
                    return Ok(value);
                }

                // Fall back to English if the key was not found
                if locale != Locale::English {
                    if let Some(english_translations) = cache.get(&Locale::English) {
                        if let Some(value) = self.get_nested_value(english_translations, key) {
                            return Ok(value);
                        }
                    }
                }

                return Err(I18nError::KeyNotFound(key.to_string()));
            }
        }

        // Not cached: load the translations and try again
        self.load_translations(locale).await?;

        {
            let cache = self.cache.read().unwrap();
            if let Some(translations) = cache.get(&locale) {
                if let Some(value) = self.get_nested_value(translations, key) {
                    return Ok(value);
                }

                // Fall back to English
                if locale != Locale::English {
                    if let Some(english_translations) = cache.get(&Locale::English) {
                        if let Some(value) = self.get_nested_value(english_translations, key) {
                            return Ok(value);
                        }
                    }
                }
            }
        }

        Err(I18nError::KeyNotFound(key.to_string()))
    }
    async fn load_translations(&self, locale: Locale) -> I18nResult<()> {
        let file_path = self.get_locale_file_path(locale);
        tracing::info!("Loading translations for locale {} from {:?}", locale.as_str(), file_path);

        // Check that the file exists
        if !file_path.exists() {
            return Err(I18nError::InvalidLocale(locale.as_str().to_string()));
        }

        // Read and parse the file
        let content = fs::read_to_string(&file_path)
            .await
            .map_err(|e| I18nError::LoadError(format!("Failed to read translation file: {}", e)))?;

        let translations: Value = serde_json::from_str(&content)
            .map_err(|e| I18nError::LoadError(format!("Failed to parse translation file: {}", e)))?;

        // Update the cache
        {
            let mut cache = self.cache.write().unwrap();
            cache.insert(locale, translations);
        }

        tracing::info!("Translations loaded for locale {}", locale.as_str());
        Ok(())
    }

    async fn available_locales(&self) -> Vec<Locale> {
        vec![Locale::English, Locale::Spanish]
    }

    async fn is_supported(&self, locale: Locale) -> bool {
        let file_path = self.get_locale_file_path(locale);
        file_path.exists()
    }
}
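The core of `get_nested_value` is walking a dotted key like `files.upload` down a JSON tree. A self-contained sketch of that walk, where the tiny `Value` enum is an illustrative stand-in for `serde_json::Value` (just enough shape to show the algorithm, not the crate's real type):

```rust
use std::collections::HashMap;

/// Minimal stand-in for serde_json::Value: a leaf string or a nested map.
enum Value {
    Str(String),
    Map(HashMap<String, Value>),
}

/// Walks "a.b.c" through nested maps and returns the leaf string, if any.
fn get_nested(data: &Value, key: &str) -> Option<String> {
    let mut current = data;
    for part in key.split('.') {
        match current {
            Value::Map(m) => current = m.get(part)?,
            Value::Str(_) => return None, // hit a leaf before the key ended
        }
    }
    match current {
        Value::Str(s) => Some(s.clone()),
        Value::Map(_) => None, // key names a subtree, not a string
    }
}

fn main() {
    let mut files = HashMap::new();
    files.insert("upload".to_string(), Value::Str("Upload".to_string()));
    let mut root = HashMap::new();
    root.insert("files".to_string(), Value::Map(files));
    let data = Value::Map(root);

    assert_eq!(get_nested(&data, "files.upload"), Some("Upload".to_string()));
    assert_eq!(get_nested(&data, "files.missing"), None);
    assert_eq!(get_nested(&data, "files"), None); // subtree, not a string
}
```

The service's version splits the parts up front and handles the last segment separately; folding the last step into the loop, as here, is equivalent.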
src/infrastructure/services/mod.rs (new file, 1 line)
@@ -0,0 +1 @@
pub mod file_system_i18n_service;
src/interfaces/api/handlers/file_handler.rs (new file, 210 lines)
@@ -0,0 +1,210 @@
use std::sync::Arc;
use axum::{
    extract::{Path, State, Multipart},
    http::{StatusCode, header},
    response::IntoResponse,
    Json,
};
use serde::Deserialize;

use crate::application::services::file_service::FileService;
use crate::domain::repositories::file_repository::FileRepositoryError;

type AppState = Arc<FileService>;

/// Handler for file-related API endpoints
pub struct FileHandler;

impl FileHandler {
    /// Uploads a file
    pub async fn upload_file(
        State(service): State<AppState>,
        mut multipart: Multipart,
    ) -> impl IntoResponse {
        // Extract the file from the multipart request
        let mut file_part = None;
        let mut folder_id = None;

        while let Some(field) = multipart.next_field().await.unwrap_or(None) {
            let name = field.name().unwrap_or("").to_string();

            if name == "file" {
                file_part = Some((
                    field.file_name().unwrap_or("unnamed").to_string(),
                    field.content_type().unwrap_or("application/octet-stream").to_string(),
                    field.bytes().await.unwrap_or_default(),
                ));
            } else if name == "folder_id" {
                let folder_id_value = field.text().await.unwrap_or_default();
                if !folder_id_value.is_empty() {
                    folder_id = Some(folder_id_value);
                }
            }
        }

        // Check that a file was provided
        if let Some((filename, content_type, data)) = file_part {
            // Upload the file from bytes
            match service.upload_file_from_bytes(filename, folder_id, content_type, data.to_vec()).await {
                Ok(file) => (StatusCode::CREATED, Json(file)).into_response(),
                Err(err) => {
                    let status = match &err {
                        FileRepositoryError::AlreadyExists(_) => StatusCode::CONFLICT,
                        FileRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                        _ => StatusCode::INTERNAL_SERVER_ERROR,
                    };

                    (status, Json(serde_json::json!({
                        "error": err.to_string()
                    }))).into_response()
                }
            }
        } else {
            (StatusCode::BAD_REQUEST, Json(serde_json::json!({
                "error": "No file provided"
            }))).into_response()
        }
    }
    /// Downloads a file
    pub async fn download_file(
        State(service): State<AppState>,
        Path(id): Path<String>,
    ) -> impl IntoResponse {
        // Get the file info and its content
        let file_result = service.get_file(&id).await;
        let content_result = service.get_file_content(&id).await;

        match (file_result, content_result) {
            (Ok(file), Ok(content)) => {
                // Build the response with the proper headers
                let headers = [
                    (header::CONTENT_TYPE, file.mime_type),
                    (header::CONTENT_DISPOSITION, format!("attachment; filename=\"{}\"", file.name)),
                ];

                (StatusCode::OK, headers, content).into_response()
            },
            (Err(err), _) | (_, Err(err)) => {
                let status = match &err {
                    FileRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                (status, Json(serde_json::json!({
                    "error": err.to_string()
                }))).into_response()
            }
        }
    }
    /// Lists files, optionally filtered by folder ID
    pub async fn list_files(
        State(service): State<AppState>,
        folder_id: Option<&str>,
    ) -> impl IntoResponse {
        match service.list_files(folder_id).await {
            Ok(files) => {
                // Always return an array, even if it is empty
                (StatusCode::OK, Json(files)).into_response()
            },
            Err(err) => {
                let status = match &err {
                    FileRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                // Return a JSON error response
                (status, Json(serde_json::json!({
                    "error": err.to_string()
                }))).into_response()
            }
        }
    }

    /// Deletes a file
    pub async fn delete_file(
        State(service): State<AppState>,
        Path(id): Path<String>,
    ) -> impl IntoResponse {
        match service.delete_file(&id).await {
            Ok(_) => StatusCode::NO_CONTENT.into_response(),
            Err(err) => {
                let status = match &err {
                    FileRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                (status, Json(serde_json::json!({
                    "error": err.to_string()
                }))).into_response()
            }
        }
    }
    /// Moves a file to a different folder
    pub async fn move_file(
        State(service): State<AppState>,
        Path(id): Path<String>,
        Json(payload): Json<MoveFilePayload>,
    ) -> impl IntoResponse {
        tracing::info!("API request: move file with ID: {} to folder: {:?}", id, payload.folder_id);

        // First verify that the file exists
        match service.get_file(&id).await {
            Ok(file) => {
                tracing::info!("File found: {} (ID: {}), proceeding with the move operation", file.name, id);

                // For target folders, we simply rely on the move operation
                // to verify their existence
                if let Some(folder_id) = &payload.folder_id {
                    tracing::info!("Will attempt to move to folder: {}", folder_id);
                }

                // Proceed with the move operation
                match service.move_file(&id, payload.folder_id).await {
                    Ok(file) => {
                        tracing::info!("File moved successfully: {} (ID: {})", file.name, file.id);
                        (StatusCode::OK, Json(file)).into_response()
                    },
                    Err(err) => {
                        let status = match &err {
                            FileRepositoryError::NotFound(_) => {
                                tracing::error!("Error moving file - not found: {}", err);
                                StatusCode::NOT_FOUND
                            },
                            FileRepositoryError::AlreadyExists(_) => {
                                tracing::error!("Error moving file - already exists: {}", err);
                                StatusCode::CONFLICT
                            },
                            _ => {
                                tracing::error!("Error moving file: {}", err);
                                StatusCode::INTERNAL_SERVER_ERROR
                            }
                        };

                        (status, Json(serde_json::json!({
                            "error": format!("Error moving the file: {}", err),
                            "code": status.as_u16(),
                            "details": format!("Error moving file with ID: {} - {}", id, err)
                        }))).into_response()
                    }
                }
            },
            Err(err) => {
                tracing::error!("Error finding file to move - it does not exist: {} (ID: {})", err, id);
                (StatusCode::NOT_FOUND, Json(serde_json::json!({
                    "error": format!("The file with ID: {} does not exist", id),
                    "code": StatusCode::NOT_FOUND.as_u16()
                }))).into_response()
            }
        }
    }
}

/// Payload for moving a file
#[derive(Debug, Deserialize)]
pub struct MoveFilePayload {
    /// Target folder ID (None means root)
    pub folder_id: Option<String>,
}
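Every handler in this file maps repository errors to HTTP statuses the same way: `NotFound` becomes 404, `AlreadyExists` becomes 409, anything else becomes 500. A std-only sketch of that mapping as a plain function, where the three-variant enum and `status_for` are illustrative stand-ins (the real `FileRepositoryError` has more variants, and the handlers use axum's `StatusCode` rather than bare `u16`):

```rust
/// Illustrative stand-in for the repository error type.
enum FileRepositoryError {
    NotFound(String),
    AlreadyExists(String),
    Other(String),
}

/// The status mapping shared by the handlers, as a numeric code.
fn status_for(err: &FileRepositoryError) -> u16 {
    match err {
        FileRepositoryError::NotFound(_) => 404,      // resource missing
        FileRepositoryError::AlreadyExists(_) => 409, // conflict with existing entry
        FileRepositoryError::Other(_) => 500,         // everything else
    }
}

fn main() {
    assert_eq!(status_for(&FileRepositoryError::NotFound("x".into())), 404);
    assert_eq!(status_for(&FileRepositoryError::AlreadyExists("x".into())), 409);
    assert_eq!(status_for(&FileRepositoryError::Other("x".into())), 500);
}
```

Centralizing this mapping (e.g. via `impl IntoResponse for FileRepositoryError`) would remove the repeated `match &err` blocks in each handler.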
src/interfaces/api/handlers/folder_handler.rs (new file, 142 lines)
@@ -0,0 +1,142 @@
use std::sync::Arc;
use axum::{
    extract::{Path, State},
    http::StatusCode,
    response::IntoResponse,
    Json,
};

use crate::application::services::folder_service::FolderService;
use crate::application::dtos::folder_dto::{CreateFolderDto, RenameFolderDto, MoveFolderDto};
use crate::domain::repositories::folder_repository::FolderRepositoryError;

type AppState = Arc<FolderService>;

/// Handler for folder-related API endpoints
pub struct FolderHandler;

impl FolderHandler {
    /// Creates a new folder
    pub async fn create_folder(
        State(service): State<AppState>,
        Json(dto): Json<CreateFolderDto>,
    ) -> impl IntoResponse {
        match service.create_folder(dto).await {
            Ok(folder) => (StatusCode::CREATED, Json(folder)).into_response(),
            Err(err) => {
                let status = match &err {
                    FolderRepositoryError::AlreadyExists(_) => StatusCode::CONFLICT,
                    FolderRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                (status, err.to_string()).into_response()
            }
        }
    }

    /// Gets a folder by ID
    pub async fn get_folder(
        State(service): State<AppState>,
        Path(id): Path<String>,
    ) -> impl IntoResponse {
        match service.get_folder(&id).await {
            Ok(folder) => (StatusCode::OK, Json(folder)).into_response(),
            Err(err) => {
                let status = match &err {
                    FolderRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                (status, err.to_string()).into_response()
            }
        }
    }
    /// Lists folders, optionally filtered by parent ID
    pub async fn list_folders(
        State(service): State<AppState>,
        parent_id: Option<&str>,
    ) -> impl IntoResponse {
        match service.list_folders(parent_id).await {
            Ok(folders) => {
                // Always return an array, even if it is empty
                (StatusCode::OK, Json(folders)).into_response()
            },
            Err(err) => {
                let status = match &err {
                    FolderRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                // Return a JSON error response
                (status, Json(serde_json::json!({
                    "error": err.to_string()
                }))).into_response()
            }
        }
    }
    /// Renames a folder
    pub async fn rename_folder(
        State(service): State<AppState>,
        Path(id): Path<String>,
        Json(dto): Json<RenameFolderDto>,
    ) -> impl IntoResponse {
        match service.rename_folder(&id, dto).await {
            Ok(folder) => (StatusCode::OK, Json(folder)).into_response(),
            Err(err) => {
                let status = match &err {
                    FolderRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    FolderRepositoryError::AlreadyExists(_) => StatusCode::CONFLICT,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                // Return a proper JSON error response
                (status, Json(serde_json::json!({
                    "error": err.to_string()
                }))).into_response()
            }
        }
    }

    /// Moves a folder to a new parent
    pub async fn move_folder(
        State(service): State<AppState>,
        Path(id): Path<String>,
        Json(dto): Json<MoveFolderDto>,
    ) -> impl IntoResponse {
        match service.move_folder(&id, dto).await {
            Ok(folder) => (StatusCode::OK, Json(folder)).into_response(),
            Err(err) => {
                let status = match &err {
                    FolderRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    FolderRepositoryError::AlreadyExists(_) => StatusCode::CONFLICT,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                (status, err.to_string()).into_response()
            }
        }
    }

    /// Deletes a folder
    pub async fn delete_folder(
        State(service): State<AppState>,
        Path(id): Path<String>,
    ) -> impl IntoResponse {
        match service.delete_folder(&id).await {
            Ok(_) => StatusCode::NO_CONTENT.into_response(),
            Err(err) => {
                let status = match &err {
                    FolderRepositoryError::NotFound(_) => StatusCode::NOT_FOUND,
                    _ => StatusCode::INTERNAL_SERVER_ERROR,
                };

                (status, err.to_string()).into_response()
            }
        }
    }
}
src/interfaces/api/handlers/i18n_handler.rs (new file, 98 lines)
@@ -0,0 +1,98 @@
use std::sync::Arc;
use axum::{
    extract::{State, Query},
    http::StatusCode,
    response::IntoResponse,
    Json,
};

use crate::application::services::i18n_application_service::I18nApplicationService;
use crate::application::dtos::i18n_dto::{LocaleDto, TranslationRequestDto, TranslationResponseDto, TranslationErrorDto};
use crate::domain::services::i18n_service::{Locale, I18nError};

type AppState = Arc<I18nApplicationService>;

/// Handler for i18n-related API endpoints
pub struct I18nHandler;

impl I18nHandler {
    /// Gets the list of available locales
    pub async fn get_locales(
        State(service): State<AppState>,
    ) -> impl IntoResponse {
        let locales = service.available_locales().await;
        let locale_dtos: Vec<LocaleDto> = locales.into_iter().map(LocaleDto::from).collect();

        (StatusCode::OK, Json(locale_dtos)).into_response()
    }

    /// Translates a key to the requested locale
    pub async fn translate(
        State(service): State<AppState>,
        Query(query): Query<TranslationRequestDto>,
    ) -> impl IntoResponse {
        let locale = match &query.locale {
            Some(locale_str) => {
                match Locale::from_str(locale_str) {
                    Some(locale) => Some(locale),
                    None => {
                        let error = TranslationErrorDto {
                            key: query.key.clone(),
                            locale: locale_str.clone(),
                            error: format!("Unsupported locale: {}", locale_str),
                        };
                        return (StatusCode::BAD_REQUEST, Json(error)).into_response();
                    }
                }
            },
            None => None,
        };

        match service.translate(&query.key, locale).await {
            Ok(text) => {
                let response = TranslationResponseDto {
                    key: query.key,
                    locale: locale.unwrap_or(Locale::default()).as_str().to_string(),
                    text,
                };
                (StatusCode::OK, Json(response)).into_response()
            },
            Err(err) => {
                let status = match &err {
                    I18nError::KeyNotFound(_) => StatusCode::NOT_FOUND,
                    I18nError::InvalidLocale(_) => StatusCode::BAD_REQUEST,
                    I18nError::LoadError(_) => StatusCode::INTERNAL_SERVER_ERROR,
                };

                let error = TranslationErrorDto {
                    key: query.key,
                    locale: locale.unwrap_or(Locale::default()).as_str().to_string(),
                    error: err.to_string(),
                };

                (status, Json(error)).into_response()
            }
        }
    }
||||
/// Gets all translations for a locale
|
||||
pub async fn get_translations(
|
||||
State(_service): State<AppState>,
|
||||
locale_code: String,
|
||||
) -> impl IntoResponse {
|
||||
let locale = match Locale::from_str(&locale_code) {
|
||||
Some(locale) => locale,
|
||||
None => {
|
||||
return (StatusCode::BAD_REQUEST, Json(serde_json::json!({
|
||||
"error": format!("Unsupported locale: {}", locale_code)
|
||||
}))).into_response();
|
||||
}
|
||||
};
|
||||
|
||||
// This implementation is a bit weird, as we don't have a way to get all translations
|
||||
// We should improve the I18nService to support this
|
||||
(StatusCode::OK, Json(serde_json::json!({
|
||||
"locale": locale.as_str()
|
||||
}))).into_response()
|
||||
}
|
||||
}
|
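The `translate` handler above deserializes `key` and an optional `locale` from the query string, and rejects unsupported locales with `400 Bad Request`. A minimal client-side sketch of building that request URL (the `/api/i18n` prefix reflects how routes.rs mounts these handlers under main.rs's `/api` nest; `buildTranslateUrl` is a hypothetical helper, not part of the repository):

```javascript
// Hypothetical helper: builds the query URL the translate handler expects.
// `key` is required; `locale` is optional and falls back to the server default.
function buildTranslateUrl(key, locale) {
    const params = new URLSearchParams({ key });
    if (locale) {
        params.set('locale', locale);
    }
    return `/api/i18n/translate?${params.toString()}`;
}

console.log(buildTranslateUrl('files.upload', 'es'));
// "/api/i18n/translate?key=files.upload&locale=es"
```

A response is then either a `TranslationResponseDto` (200) or a `TranslationErrorDto` with a 400/404/500 status, mirroring the `I18nError` match above.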
4
src/interfaces/api/handlers/mod.rs
Normal file
@@ -0,0 +1,4 @@
pub mod file_handler;
pub mod folder_handler;
pub mod i18n_handler;
4
src/interfaces/api/mod.rs
Normal file
@@ -0,0 +1,4 @@
pub mod handlers;
pub mod routes;

pub use routes::create_api_routes;
74
src/interfaces/api/routes.rs
Normal file
@@ -0,0 +1,74 @@
use std::sync::Arc;
use axum::{
    routing::{get, post, put, delete},
    Router,
    extract::State,
};
use tower_http::{compression::CompressionLayer, trace::TraceLayer};

use crate::application::services::folder_service::FolderService;
use crate::application::services::file_service::FileService;
use crate::application::services::i18n_application_service::I18nApplicationService;
use crate::interfaces::api::handlers::folder_handler::FolderHandler;
use crate::interfaces::api::handlers::file_handler::FileHandler;
use crate::interfaces::api::handlers::i18n_handler::I18nHandler;

/// Creates API routes for the application
pub fn create_api_routes(
    folder_service: Arc<FolderService>,
    file_service: Arc<FileService>,
    i18n_service: Option<Arc<I18nApplicationService>>,
) -> Router {
    let folders_router = Router::new()
        .route("/", post(FolderHandler::create_folder))
        .route("/", get(|State(service): State<Arc<FolderService>>| async move {
            // No parent ID means list root folders
            FolderHandler::list_folders(State(service), None).await
        }))
        .route("/{id}", get(FolderHandler::get_folder))
        .route("/{id}/rename", put(FolderHandler::rename_folder))
        .route("/{id}/move", put(FolderHandler::move_folder))
        .route("/{id}", delete(FolderHandler::delete_folder))
        .with_state(folder_service);

    let files_router = Router::new()
        .route("/", get(|
            State(service): State<Arc<FileService>>,
            axum::extract::Query(params): axum::extract::Query<std::collections::HashMap<String, String>>,
        | async move {
            // Get folder_id from query parameter if present
            let folder_id = params.get("folder_id").map(|id| id.as_str());
            tracing::info!("API: listing files with folder_id: {:?}", folder_id);
            FileHandler::list_files(State(service), folder_id).await
        }))
        .route("/upload", post(FileHandler::upload_file))
        .route("/{id}", get(FileHandler::download_file))
        .route("/{id}", delete(FileHandler::delete_file))
        .route("/{id}/move", put(FileHandler::move_file))
        .with_state(file_service);

    // Create a router without the i18n routes
    let mut router = Router::new()
        .nest("/folders", folders_router)
        .nest("/files", files_router);

    // Add i18n routes if the service is provided
    if let Some(i18n_service) = i18n_service {
        let i18n_router = Router::new()
            .route("/locales", get(I18nHandler::get_locales))
            .route("/translate", get(I18nHandler::translate))
            .route("/locales/{locale_code}", get(|
                State(service): State<Arc<I18nApplicationService>>,
                axum::extract::Path(locale_code): axum::extract::Path<String>,
            | async move {
                I18nHandler::get_translations(State(service), locale_code).await
            }))
            .with_state(i18n_service);

        router = router.nest("/i18n", i18n_router);
    }

    router
        .layer(CompressionLayer::new())
        .layer(TraceLayer::new_for_http())
}
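Since main.rs nests all of the above under `/api`, the router yields a small, fixed endpoint surface. A sketch of that table with a naive matcher, useful for checking which handler a given request would reach (the table mirrors routes.rs; the i18n routes are omitted because they are only mounted when the optional service is provided, and `resolve` itself is our illustration, not axum's actual matching logic):

```javascript
// Route table mirroring create_api_routes, with paths as mounted under /api.
const routes = [
    ['POST',   '/api/folders',             'create_folder'],
    ['GET',    '/api/folders',             'list_folders'],
    ['GET',    '/api/folders/{id}',        'get_folder'],
    ['PUT',    '/api/folders/{id}/rename', 'rename_folder'],
    ['PUT',    '/api/folders/{id}/move',   'move_folder'],
    ['DELETE', '/api/folders/{id}',        'delete_folder'],
    ['GET',    '/api/files',               'list_files'],
    ['POST',   '/api/files/upload',        'upload_file'],
    ['GET',    '/api/files/{id}',          'download_file'],
    ['DELETE', '/api/files/{id}',          'delete_file'],
    ['PUT',    '/api/files/{id}/move',     'move_file'],
];

// Naive matcher: treat each {param} segment as "any non-slash run".
function resolve(method, path) {
    for (const [m, pattern, handler] of routes) {
        const re = new RegExp('^' + pattern.replace(/\{[^}]+\}/g, '[^/]+') + '$');
        if (m === method && re.test(path)) return handler;
    }
    return null;
}

console.log(resolve('GET', '/api/files/123')); // "download_file"
```

Note that `/api/files/upload` only avoids colliding with `/api/files/{id}` because the methods differ (POST vs GET).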
4
src/interfaces/mod.rs
Normal file
@@ -0,0 +1,4 @@
pub mod api;
pub mod web;

pub use api::create_api_routes;
11
src/interfaces/web/mod.rs
Normal file
@@ -0,0 +1,11 @@
use axum::Router;
use tower_http::services::ServeDir;
use std::path::PathBuf;

/// Creates web routes for serving static files
pub fn create_web_routes() -> Router {
    Router::new()
        .fallback_service(
            ServeDir::new(PathBuf::from("static"))
        )
}
80
src/main.rs
Normal file
@@ -0,0 +1,80 @@
use std::net::SocketAddr;
use std::path::PathBuf;
use std::sync::Arc;

use axum::Router;
use axum::serve;
use tower_http::trace::TraceLayer;
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt};

mod domain;
mod application;
mod infrastructure;
mod interfaces;

use application::services::folder_service::FolderService;
use application::services::file_service::FileService;
use application::services::i18n_application_service::I18nApplicationService;
use infrastructure::repositories::folder_fs_repository::FolderFsRepository;
use infrastructure::repositories::file_fs_repository::FileFsRepository;
use infrastructure::services::file_system_i18n_service::FileSystemI18nService;
use interfaces::{create_api_routes, web::create_web_routes};

#[tokio::main]
async fn main() {
    // Initialize tracing
    tracing_subscriber::registry()
        .with(tracing_subscriber::EnvFilter::new(
            std::env::var("RUST_LOG").unwrap_or_else(|_| "info".into()),
        ))
        .with(tracing_subscriber::fmt::layer())
        .init();

    // Set up storage directory
    let storage_path = PathBuf::from("./storage");
    if !storage_path.exists() {
        std::fs::create_dir_all(&storage_path).expect("Failed to create storage directory");
    }

    // Set up locales directory
    let locales_path = PathBuf::from("./static/locales");
    if !locales_path.exists() {
        std::fs::create_dir_all(&locales_path).expect("Failed to create locales directory");
    }

    // Initialize repositories
    let folder_repository = Arc::new(FolderFsRepository::new(storage_path.clone()));
    let file_repository = Arc::new(FileFsRepository::new(storage_path.clone(), folder_repository.clone()));

    // Initialize services
    let folder_service = Arc::new(FolderService::new(folder_repository));
    let file_service = Arc::new(FileService::new(file_repository));

    // Initialize i18n service
    let i18n_repository = Arc::new(FileSystemI18nService::new(locales_path));
    let i18n_service = Arc::new(I18nApplicationService::new(i18n_repository));

    // Preload translations
    if let Err(e) = i18n_service.load_translations(domain::services::i18n_service::Locale::English).await {
        tracing::warn!("Failed to load English translations: {}", e);
    }
    if let Err(e) = i18n_service.load_translations(domain::services::i18n_service::Locale::Spanish).await {
        tracing::warn!("Failed to load Spanish translations: {}", e);
    }

    // Build application router
    let api_routes = create_api_routes(folder_service, file_service, Some(i18n_service));
    let web_routes = create_web_routes();

    let app = Router::new()
        .nest("/api", api_routes)
        .merge(web_routes)
        .layer(TraceLayer::new_for_http());

    // Start server
    let addr = SocketAddr::from(([127, 0, 0, 1], 8085));
    tracing::info!("listening on {}", addr);

    let listener = tokio::net::TcpListener::bind(&addr).await.unwrap();
    serve::serve(listener, app).await.unwrap();
}
5
static/favicon.ico
Normal file
@@ -0,0 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="16" height="16" xmlns="http://www.w3.org/2000/svg">
  <rect width="16" height="16" fill="#556ee6" rx="3" ry="3" />
  <path d="M 3,4 V 12 H 13 V 6 H 8 L 6,4 Z" fill="#ffffff" />
</svg>
After Width: 16 | Height: 16 | Size: 235 B
8
static/identifier.sh
Normal file
@@ -0,0 +1,8 @@
while IFS= read -r -d '' file; do
    if grep -Iq . "$file"; then
        echo "===== $file ====="
        cat "$file"
        echo -e "\n"
    fi
done < <(find . -type f -print0)
1991
static/index.html
Normal file
File diff suppressed because it is too large
516
static/js/fileRenderer.js
Normal file
@@ -0,0 +1,516 @@
/**
 * OxiCloud File Renderer Module
 * Optimized rendering for large file lists using virtual rendering
 */

// Configuration
const ITEMS_PER_PAGE = 100; // Number of items to render at once
const ROW_HEIGHT = 80;      // Estimated row height for list view
const CARD_HEIGHT = 180;    // Estimated height for grid view card
const CARD_WIDTH = 180;     // Estimated width for grid view card

class FileRenderer {
    constructor() {
        this.files = [];
        this.folders = [];
        this.currentView = 'grid';
        this.gridContainer = document.getElementById('files-grid');
        this.listContainer = document.getElementById('files-list-view');
        this.visibleItems = {};
        this.offsetY = 0;
        this.totalHeight = 0;
        this.gridColumns = 0;
        this.i18n = window.i18n || { t: key => key };

        // Setup intersection observer for lazy loading
        this.setupIntersectionObserver();

        // Handle scroll event for virtual scrolling
        this.handleScroll = this.handleScroll.bind(this);
        this.setupScrollListeners();
    }

    /**
     * Set up intersection observer for lazy loading
     */
    setupIntersectionObserver() {
        this.observer = new IntersectionObserver((entries) => {
            entries.forEach(entry => {
                if (entry.isIntersecting) {
                    const elem = entry.target;
                    if (elem.dataset.lazySrc) {
                        elem.src = elem.dataset.lazySrc;
                        delete elem.dataset.lazySrc;
                        this.observer.unobserve(elem);
                    }
                }
            });
        }, {
            rootMargin: '200px', // Load images 200px before they come into view
        });
    }

    /**
     * Setup scroll listeners for virtual scrolling
     */
    setupScrollListeners() {
        const container = document.querySelector('.files-list');
        if (container) {
            container.addEventListener('scroll', this.handleScroll);
            // Also update on resize
            window.addEventListener('resize', this.updateVisibleItems.bind(this));
        }
    }

    /**
     * Handle scroll events for virtual scrolling
     */
    handleScroll() {
        const container = document.querySelector('.files-list');
        if (!container) return;

        this.offsetY = container.scrollTop;
        requestAnimationFrame(() => this.updateVisibleItems());
    }

    /**
     * Calculate how many items are visible and render only those
     */
    updateVisibleItems() {
        if (!this.files || !this.folders) return;

        const container = document.querySelector('.files-list');
        if (!container) return;

        const viewportHeight = container.clientHeight;
        const viewportWidth = container.clientWidth;

        // Calculate grid columns based on container width and card width
        this.gridColumns = Math.floor(viewportWidth / CARD_WIDTH);
        if (this.gridColumns < 1) this.gridColumns = 1;

        if (this.currentView === 'grid') {
            this.updateGridView(viewportHeight);
        } else {
            this.updateListView(viewportHeight);
        }
    }

    /**
     * Update grid view with virtual scrolling
     */
    updateGridView(viewportHeight) {
        const allItems = [...this.folders, ...this.files];
        const rows = Math.ceil(allItems.length / this.gridColumns);
        this.totalHeight = rows * CARD_HEIGHT;

        // Calculate visible range
        const startRow = Math.floor(this.offsetY / CARD_HEIGHT);
        const visibleRows = Math.ceil(viewportHeight / CARD_HEIGHT) + 1; // +1 for partial rows

        const startIdx = startRow * this.gridColumns;
        const endIdx = Math.min(allItems.length, (startRow + visibleRows) * this.gridColumns);

        // Generate a map of visible items
        const newVisibleItems = {};
        for (let i = startIdx; i < endIdx; i++) {
            newVisibleItems[i] = true;
        }

        // Remove items that are no longer visible
        if (this.gridContainer) {
            const children = Array.from(this.gridContainer.children);
            children.forEach(child => {
                const idx = parseInt(child.dataset.index, 10);
                if (!newVisibleItems[idx]) {
                    this.gridContainer.removeChild(child);
                } else {
                    // Item is still visible, remove from new items list
                    delete newVisibleItems[idx];
                }
            });
        }

        // Add new visible items
        const fragment = document.createDocumentFragment();
        Object.keys(newVisibleItems).forEach(idx => {
            const i = parseInt(idx, 10);
            if (i < allItems.length) {
                const item = allItems[i];
                const elem = this.renderGridItem(item, i);
                fragment.appendChild(elem);
            }
        });

        if (this.gridContainer) {
            this.gridContainer.appendChild(fragment);
        }

        this.visibleItems = { ...this.visibleItems, ...newVisibleItems };
    }

    /**
     * Update list view with virtual scrolling
     */
    updateListView(viewportHeight) {
        const allItems = [...this.folders, ...this.files];
        this.totalHeight = allItems.length * ROW_HEIGHT;

        // Calculate visible range
        const startIdx = Math.floor(this.offsetY / ROW_HEIGHT);
        const visibleCount = Math.ceil(viewportHeight / ROW_HEIGHT) + 1; // +1 for partial rows
        const endIdx = Math.min(allItems.length, startIdx + visibleCount);

        // Generate a map of visible items
        const newVisibleItems = {};
        for (let i = startIdx; i < endIdx; i++) {
            newVisibleItems[i] = true;
        }

        // Remove items that are no longer visible
        if (this.listContainer) {
            const children = Array.from(this.listContainer.children);
            // Skip the first child as it's the header
            for (let i = 1; i < children.length; i++) {
                const child = children[i];
                const idx = parseInt(child.dataset.index, 10);
                if (!newVisibleItems[idx]) {
                    this.listContainer.removeChild(child);
                } else {
                    // Item is still visible, remove from new items list
                    delete newVisibleItems[idx];
                }
            }
        }

        // Add new visible items
        const fragment = document.createDocumentFragment();
        Object.keys(newVisibleItems).forEach(idx => {
            const i = parseInt(idx, 10);
            if (i < allItems.length) {
                const item = allItems[i];
                const elem = this.renderListItem(item, i);
                fragment.appendChild(elem);
            }
        });

        if (this.listContainer) {
            this.listContainer.appendChild(fragment);
        }

        this.visibleItems = { ...this.visibleItems, ...newVisibleItems };
    }

    /**
     * Render a single grid item (file or folder)
     */
    renderGridItem(item, index) {
        const elem = document.createElement('div');
        elem.className = 'file-card';
        elem.dataset.index = index;

        const isFolder = 'parent_id' in item; // Folders have parent_id

        if (isFolder) {
            elem.dataset.folderId = item.id;
            elem.dataset.folderName = item.name;
            elem.dataset.parentId = item.parent_id || "";

            elem.innerHTML = `
                <div class="file-icon folder-icon">
                    <i class="fas fa-folder"></i>
                </div>
                <div class="file-name">${item.name}</div>
            `;

            // Make draggable
            if (item.parent_id) {
                elem.setAttribute('draggable', 'true');
                elem.addEventListener('dragstart', (e) => {
                    e.dataTransfer.setData('text/plain', item.id);
                    e.dataTransfer.setData('application/oxicloud-folder', 'true');
                    elem.classList.add('dragging');
                });

                elem.addEventListener('dragend', () => {
                    elem.classList.remove('dragging');
                    document.querySelectorAll('.drop-target').forEach(el => {
                        el.classList.remove('drop-target');
                    });
                });
            }

            // Click event
            elem.addEventListener('click', () => {
                if (typeof window.selectFolder === 'function') {
                    window.selectFolder(item.id, item.name);
                }
            });

        } else {
            // File card
            elem.dataset.fileId = item.id;
            elem.dataset.fileName = item.name;
            elem.dataset.folderId = item.folder_id || "";

            // Determine icon based on MIME type
            let iconClass = 'fas fa-file';

            if (item.mime_type) {
                if (item.mime_type.startsWith('image/')) {
                    iconClass = 'fas fa-file-image';
                } else if (item.mime_type.startsWith('text/')) {
                    iconClass = 'fas fa-file-alt';
                } else if (item.mime_type.startsWith('video/')) {
                    iconClass = 'fas fa-file-video';
                } else if (item.mime_type.startsWith('audio/')) {
                    iconClass = 'fas fa-file-audio';
                } else if (item.mime_type === 'application/pdf') {
                    iconClass = 'fas fa-file-pdf';
                }
            }

            elem.innerHTML = `
                <div class="file-icon">
                    <i class="${iconClass}"></i>
                </div>
                <div class="file-name">${item.name}</div>
            `;

            // Make draggable
            elem.setAttribute('draggable', 'true');
            elem.addEventListener('dragstart', (e) => {
                e.dataTransfer.setData('text/plain', item.id);
                elem.classList.add('dragging');
            });

            elem.addEventListener('dragend', () => {
                elem.classList.remove('dragging');
                document.querySelectorAll('.drop-target').forEach(el => {
                    el.classList.remove('drop-target');
                });
            });

            // Click event (download)
            elem.addEventListener('click', () => {
                window.location.href = `/api/files/${item.id}`;
            });
        }

        // Add to intersection observer (for future thumbnail support)
        this.observer.observe(elem);

        return elem;
    }

    /**
     * Render a single list item (file or folder)
     */
    renderListItem(item, index) {
        const elem = document.createElement('div');
        elem.className = 'file-item';
        elem.dataset.index = index;

        const isFolder = 'parent_id' in item; // Folders have parent_id

        if (isFolder) {
            elem.dataset.folderId = item.id;
            elem.dataset.folderName = item.name;
            elem.dataset.parentId = item.parent_id || "";

            // Format date
            const modifiedDate = new Date(item.modified_at * 1000);
            const formattedDate = modifiedDate.toLocaleDateString() + ' ' +
                modifiedDate.toLocaleTimeString([], {hour: '2-digit', minute: '2-digit'});

            elem.innerHTML = `
                <div class="name-cell">
                    <div class="file-icon folder-icon">
                        <i class="fas fa-folder"></i>
                    </div>
                    <span>${item.name}</span>
                </div>
                <div>${this.i18n.t('files.file_types.folder')}</div>
                <div>--</div>
                <div>${formattedDate}</div>
            `;

            // Make draggable
            if (item.parent_id) {
                elem.setAttribute('draggable', 'true');
                elem.addEventListener('dragstart', (e) => {
                    e.dataTransfer.setData('text/plain', item.id);
                    e.dataTransfer.setData('application/oxicloud-folder', 'true');
                    elem.classList.add('dragging');
                });

                elem.addEventListener('dragend', () => {
                    elem.classList.remove('dragging');
                    document.querySelectorAll('.drop-target').forEach(el => {
                        el.classList.remove('drop-target');
                    });
                });
            }

            // Click event
            elem.addEventListener('click', () => {
                if (typeof window.selectFolder === 'function') {
                    window.selectFolder(item.id, item.name);
                }
            });

        } else {
            // File item
            elem.dataset.fileId = item.id;
            elem.dataset.fileName = item.name;
            elem.dataset.folderId = item.folder_id || "";

            // Determine file type label based on MIME type
            let typeLabel = this.i18n.t('files.file_types.document');
            let iconClass = 'fas fa-file';

            if (item.mime_type) {
                if (item.mime_type.startsWith('image/')) {
                    iconClass = 'fas fa-file-image';
                    typeLabel = this.i18n.t('files.file_types.image');
                } else if (item.mime_type.startsWith('text/')) {
                    iconClass = 'fas fa-file-alt';
                    typeLabel = this.i18n.t('files.file_types.text');
                } else if (item.mime_type.startsWith('video/')) {
                    iconClass = 'fas fa-file-video';
                    typeLabel = this.i18n.t('files.file_types.video');
                } else if (item.mime_type.startsWith('audio/')) {
                    iconClass = 'fas fa-file-audio';
                    typeLabel = this.i18n.t('files.file_types.audio');
                } else if (item.mime_type === 'application/pdf') {
                    iconClass = 'fas fa-file-pdf';
                    typeLabel = this.i18n.t('files.file_types.pdf');
                }
            }

            // Format file size
            const fileSize = this.formatFileSize(item.size);

            // Format date
            const modifiedDate = new Date(item.modified_at * 1000);
            const formattedDate = modifiedDate.toLocaleDateString() + ' ' +
                modifiedDate.toLocaleTimeString([], {hour: '2-digit', minute: '2-digit'});

            elem.innerHTML = `
                <div class="name-cell">
                    <div class="file-icon">
                        <i class="${iconClass}"></i>
                    </div>
                    <span>${item.name}</span>
                </div>
                <div>${typeLabel}</div>
                <div>${fileSize}</div>
                <div>${formattedDate}</div>
            `;

            // Make draggable
            elem.setAttribute('draggable', 'true');
            elem.addEventListener('dragstart', (e) => {
                e.dataTransfer.setData('text/plain', item.id);
                elem.classList.add('dragging');
            });

            elem.addEventListener('dragend', () => {
                elem.classList.remove('dragging');
                document.querySelectorAll('.drop-target').forEach(el => {
                    el.classList.remove('drop-target');
                });
            });

            // Click event (download)
            elem.addEventListener('click', () => {
                window.location.href = `/api/files/${item.id}`;
            });
        }

        return elem;
    }

    /**
     * Format file size in human-readable format
     */
    formatFileSize(bytes) {
        if (bytes === 0) return '0 Bytes';

        const k = 1024;
        const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
        const i = Math.floor(Math.log(bytes) / Math.log(k));

        return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
    }

    /**
     * Set current view mode (grid or list)
     */
    setView(view) {
        this.currentView = view;
        this.updateVisibleItems();
    }

    /**
     * Load and render files and folders
     */
    loadData(folders, files) {
        this.folders = folders || [];
        this.files = files || [];

        // Reset containers
        if (this.gridContainer) {
            this.gridContainer.innerHTML = '';
        }

        if (this.listContainer) {
            // Preserve the header
            const header = this.listContainer.querySelector('.list-header');
            this.listContainer.innerHTML = '';
            if (header) {
                this.listContainer.appendChild(header);
            }
        }

        this.visibleItems = {};
        this.updateVisibleItems();
    }
}

// Create the file renderer when the DOM is ready
document.addEventListener('DOMContentLoaded', () => {
    window.fileRenderer = new FileRenderer();

    // Expose the selectFolder function for navigation
    window.selectFolder = (id, name) => {
        // Update current path
        window.currentPath = id;

        // Update breadcrumb
        const breadcrumb = document.querySelector('.breadcrumb');
        if (breadcrumb) {
            const home = breadcrumb.querySelector('.breadcrumb-item');
            breadcrumb.innerHTML = '';

            if (home) {
                breadcrumb.appendChild(home);

                if (name) {
                    const separator = document.createElement('span');
                    separator.className = 'breadcrumb-separator';
                    separator.textContent = '>';
                    breadcrumb.appendChild(separator);

                    const folderItem = document.createElement('span');
                    folderItem.className = 'breadcrumb-item';
                    folderItem.textContent = name;
                    breadcrumb.appendChild(folderItem);
                }
            }
        }

        // Load files for this folder
        window.loadFiles();
    };
});
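The list-view virtualization in `updateListView` above renders only the rows intersecting the viewport. That visible-range arithmetic is pure and can be checked in isolation; a sketch, where `ROW_HEIGHT` mirrors the module constant and `visibleRange` is our extraction of the same three lines:

```javascript
const ROW_HEIGHT = 80; // mirrors the list-view constant in fileRenderer.js

// Returns the [start, end) index range of rows to render,
// exactly as updateListView computes startIdx/endIdx.
function visibleRange(scrollTop, viewportHeight, itemCount) {
    const start = Math.floor(scrollTop / ROW_HEIGHT);
    const count = Math.ceil(viewportHeight / ROW_HEIGHT) + 1; // +1 for partial rows
    const end = Math.min(itemCount, start + count);
    return { start, end };
}

// A 400px viewport at the top of a 1000-item list renders rows 0..6.
console.log(visibleRange(0, 400, 1000)); // { start: 0, end: 6 }
```

The `+ 1` overdraw guarantees a row that is only partially scrolled into view is still rendered, at the cost of at most one extra DOM node per frame.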
240
static/js/i18n.js
Normal file
@@ -0,0 +1,240 @@
/**
 * OxiCloud Internationalization (i18n) Module
 *
 * This module provides functionality for internationalization of the OxiCloud web interface.
 * It loads translations from the server and provides functions to translate keys.
 */

// Current locale code (default to browser locale if available, fallback to English)
let currentLocale =
    (navigator.language && navigator.language.substring(0, 2)) ||
    (navigator.userLanguage && navigator.userLanguage.substring(0, 2)) ||
    'en';

// Supported locales
const supportedLocales = ['en', 'es'];

// Fallback to English if locale is not supported
if (!supportedLocales.includes(currentLocale)) {
    currentLocale = 'en';
}

// Cache for translations
const translations = {};

/**
 * Load translations for a specific locale
 * @param {string} locale - The locale code to load (e.g., 'en', 'es')
 * @returns {Promise<object>} - A promise that resolves to the translations object
 */
async function loadTranslations(locale) {
    // Check if already loaded
    if (translations[locale]) {
        return translations[locale];
    }

    try {
        const response = await fetch(`/api/i18n/locales/${locale}`);
        if (!response.ok) {
            throw new Error(`Failed to load translations for ${locale}`);
        }

        // Fetch the actual JSON file directly if the API doesn't provide a full translations object
        const localeData = await fetch(`/locales/${locale}.json`);
        if (!localeData.ok) {
            throw new Error(`Failed to load locale file for ${locale}`);
        }

        translations[locale] = await localeData.json();
        return translations[locale];
    } catch (error) {
        console.error('Error loading translations:', error);

        // Try to load from file directly as fallback
        try {
            const fallbackResponse = await fetch(`/locales/${locale}.json`);
            if (fallbackResponse.ok) {
                translations[locale] = await fallbackResponse.json();
                return translations[locale];
            }
        } catch (fallbackError) {
            console.error('Error loading fallback translations:', fallbackError);
        }

        // Return empty object as last resort
        translations[locale] = {};
        return translations[locale];
    }
}

/**
 * Get a nested translation value
 * @param {object} obj - The translations object
 * @param {string} path - The dot-notation path to the translation
 * @returns {string|null} - The translation value or null if not found
 */
function getNestedValue(obj, path) {
    const keys = path.split('.');
    let current = obj;

    for (const key of keys) {
        if (current && typeof current === 'object' && key in current) {
            current = current[key];
        } else {
            return null;
        }
    }

    return typeof current === 'string' ? current : null;
}

/**
 * Translate a key to the current locale
 * @param {string} key - The translation key (dot notation, e.g., 'app.title')
 * @param {object} params - Parameters to replace in the translation (e.g., {name: 'John'})
 * @returns {string} - The translated string or the key itself if not found
 */
function t(key, params = {}) {
    // Get translation from cache
    const localeData = translations[currentLocale];
    if (!localeData) {
        // Translation not loaded yet, return key
        console.warn(`Translations for ${currentLocale} not loaded yet`);
        return key;
    }

    // Get the translation value
    const value = getNestedValue(localeData, key);
    if (!value) {
        // Try fallback to English
        if (currentLocale !== 'en' && translations['en']) {
            const fallbackValue = getNestedValue(translations['en'], key);
            if (fallbackValue) {
                return interpolate(fallbackValue, params);
            }
        }

        // Key not found, return key
        console.warn(`Translation key not found: ${key}`);
        return key;
    }

    // Replace parameters
    return interpolate(value, params);
}

/**
 * Replace parameters in a translation string
 * @param {string} text - The translation string with placeholders
 * @param {object} params - The parameters to replace
 * @returns {string} - The interpolated string
 */
function interpolate(text, params) {
    return text.replace(/{{\s*([^}]+)\s*}}/g, (_, key) => {
        return params[key.trim()] !== undefined ? params[key.trim()] : `{{${key}}}`;
    });
}

/**
 * Change the current locale
 * @param {string} locale - The locale code to switch to
 * @returns {Promise<boolean>} - A promise that resolves to true if successful
 */
async function setLocale(locale) {
    if (!supportedLocales.includes(locale)) {
        console.error(`Locale not supported: ${locale}`);
        return false;
    }

    // Load translations if not loaded yet
    if (!translations[locale]) {
        await loadTranslations(locale);
    }

    // Update current locale
    currentLocale = locale;

    // Save locale preference
    localStorage.setItem('oxicloud-locale', locale);

    // Trigger an event for components to update
    window.dispatchEvent(new CustomEvent('localeChanged', { detail: { locale } }));

    // Update all elements with data-i18n attribute
    translatePage();

    return true;
}

/**
 * Initialize the i18n system
 * @returns {Promise<void>}
 */
async function initI18n() {
    // Load saved locale preference
    const savedLocale = localStorage.getItem('oxicloud-locale');
    if (savedLocale && supportedLocales.includes(savedLocale)) {
        currentLocale = savedLocale;
    }

    // Load translations for current locale
    await loadTranslations(currentLocale);
// Preload English translations as fallback
|
||||
if (currentLocale !== 'en') {
|
||||
await loadTranslations('en');
|
||||
}
|
||||
|
||||
// Translate the page
|
||||
translatePage();
|
||||
|
||||
console.log(`I18n initialized with locale: ${currentLocale}`);
|
||||
}
|
||||
|
||||
/**
|
||||
* Translate all elements with data-i18n attribute
|
||||
*/
|
||||
function translatePage() {
|
||||
document.querySelectorAll('[data-i18n]').forEach(element => {
|
||||
const key = element.getAttribute('data-i18n');
|
||||
element.textContent = t(key);
|
||||
});
|
||||
|
||||
document.querySelectorAll('[data-i18n-placeholder]').forEach(element => {
|
||||
const key = element.getAttribute('data-i18n-placeholder');
|
||||
element.placeholder = t(key);
|
||||
});
|
||||
|
||||
document.querySelectorAll('[data-i18n-title]').forEach(element => {
|
||||
const key = element.getAttribute('data-i18n-title');
|
||||
element.title = t(key);
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current locale
|
||||
* @returns {string} - The current locale code
|
||||
*/
|
||||
function getCurrentLocale() {
|
||||
return currentLocale;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get list of supported locales
|
||||
* @returns {Array<string>} - Array of supported locale codes
|
||||
*/
|
||||
function getSupportedLocales() {
|
||||
return [...supportedLocales];
|
||||
}
|
||||
|
||||
// Initialize when DOM is ready
|
||||
document.addEventListener('DOMContentLoaded', initI18n);
|
||||
|
||||
// Export functions for use in other modules
|
||||
window.i18n = {
|
||||
t,
|
||||
setLocale,
|
||||
getCurrentLocale,
|
||||
getSupportedLocales,
|
||||
translatePage
|
||||
};
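The exported `t()` delegates parameter substitution to `interpolate()` above. Since that function is pure, its behaviour can be checked outside the browser; a minimal Node sketch reproducing it from this file:

```javascript
// interpolate() reproduced verbatim from i18n.js above; it needs no
// window/document, so it runs as-is under Node.
function interpolate(text, params) {
  return text.replace(/{{\s*([^}]+)\s*}}/g, (_, key) => {
    return params[key.trim()] !== undefined ? params[key.trim()] : `{{${key}}}`;
  });
}

console.log(interpolate('Hello, {{name}}!', { name: 'John' })); // → "Hello, John!"
console.log(interpolate('Hi, {{user}}', {})); // → "Hi, {{user}}" (missing params keep their placeholder)
```

Note that a missing parameter leaves the placeholder in the output rather than emitting an empty string, which makes untranslated interpolations visible during development.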
66  static/js/languageSelector.js  Normal file
@@ -0,0 +1,66 @@
/**
 * Language Selector Component for OxiCloud
 */

// Language codes and names
const languages = [
  { code: 'en', name: 'English' },
  { code: 'es', name: 'Español' }
];

/**
 * Creates and initializes a language selector component
 * @param {string} containerId - ID of the container element
 */
function createLanguageSelector(containerId = 'language-selector') {
  // Get or create container
  let container = document.getElementById(containerId);
  if (!container) {
    console.warn(`Container with ID "${containerId}" not found, creating one.`);
    container = document.createElement('div');
    container.id = containerId;
    document.body.appendChild(container);
  }

  // Create dropdown
  const select = document.createElement('select');
  select.className = 'language-select';
  select.setAttribute('aria-label', 'Select language');

  // Add options
  languages.forEach(lang => {
    const option = document.createElement('option');
    option.value = lang.code;
    option.textContent = lang.name;
    select.appendChild(option);
  });

  // Set current language
  const currentLocale = window.i18n ? window.i18n.getCurrentLocale() : 'en';
  select.value = currentLocale;

  // Add change event
  select.addEventListener('change', async (e) => {
    const locale = e.target.value;
    if (window.i18n) {
      await window.i18n.setLocale(locale);
    }
  });

  // Add to container
  container.innerHTML = '';
  container.appendChild(select);

  // Add event listener for locale changes
  window.addEventListener('localeChanged', (e) => {
    select.value = e.detail.locale;
  });

  return container;
}

// Create language selector when DOM is ready
document.addEventListener('DOMContentLoaded', () => {
  // Create language selector
  createLanguageSelector();
});
70  static/locales/en.json  Normal file
@@ -0,0 +1,70 @@
{
  "app": {
    "title": "OxiCloud",
    "description": "Minimalist cloud storage system"
  },
  "nav": {
    "files": "Files",
    "shared": "Shared",
    "recent": "Recent",
    "favorites": "Favorites",
    "trash": "Trash"
  },
  "actions": {
    "search": "Search files...",
    "new_folder": "New folder",
    "upload": "Upload",
    "rename": "Rename",
    "move": "Move to...",
    "move_to": "Move to",
    "delete": "Delete",
    "cancel": "Cancel",
    "confirm": "Confirm"
  },
  "files": {
    "name": "Name",
    "type": "Type",
    "size": "Size",
    "modified": "Modified",
    "no_files": "No files in this folder",
    "view_grid": "Grid view",
    "view_list": "List view",
    "file_types": {
      "document": "Document",
      "image": "Image",
      "video": "Video",
      "audio": "Audio",
      "pdf": "PDF",
      "text": "Text",
      "folder": "Folder"
    }
  },
  "dialogs": {
    "rename_folder": "Rename folder",
    "new_name": "New name",
    "move_file": "Move file",
    "select_destination": "Select destination folder",
    "root": "Root",
    "delete_confirmation": "Are you sure you want to delete",
    "and_contents": "and all its contents",
    "no_undo": "This action cannot be undone"
  },
  "dropzone": {
    "drag_files": "Drag files here or click to select",
    "drop_files": "Drop files to upload"
  },
  "errors": {
    "file_not_found": "File not found",
    "folder_not_found": "Folder not found",
    "delete_error": "Error deleting",
    "upload_error": "Error uploading file",
    "rename_error": "Error renaming",
    "move_error": "Error moving",
    "empty_name": "Name cannot be empty",
    "name_exists": "A file or folder with that name already exists",
    "generic_error": "An error has occurred"
  },
  "breadcrumb": {
    "home": "Home"
  }
}
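These locale files are plain nested JSON, resolved by the dot-notation lookup in `static/js/i18n.js` (`getNestedValue`). A minimal Node sketch of that resolution against an inlined excerpt of this file:

```javascript
// Excerpt of static/locales/en.json, inlined to keep the sketch self-contained.
const en = {
  app: { title: 'OxiCloud' },
  files: { file_types: { pdf: 'PDF' } }
};

// Dot-notation resolution, as implemented by getNestedValue() in i18n.js.
function getNestedValue(obj, path) {
  let current = obj;
  for (const key of path.split('.')) {
    if (current && typeof current === 'object' && key in current) {
      current = current[key];
    } else {
      return null;
    }
  }
  return typeof current === 'string' ? current : null;
}

console.log(getNestedValue(en, 'files.file_types.pdf')); // → "PDF"
console.log(getNestedValue(en, 'nav.missing'));          // → null
```

Only string leaves are returned; looking up an intermediate object (e.g. `'files'`) also yields `null`, so every translation key must point at a leaf value.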
70  static/locales/es.json  Normal file
@@ -0,0 +1,70 @@
{
  "app": {
    "title": "OxiCloud",
    "description": "Sistema de almacenamiento en la nube minimalista"
  },
  "nav": {
    "files": "Archivos",
    "shared": "Compartidos",
    "recent": "Recientes",
    "favorites": "Favoritos",
    "trash": "Papelera"
  },
  "actions": {
    "search": "Buscar archivos...",
    "new_folder": "Nueva carpeta",
    "upload": "Subir",
    "rename": "Renombrar",
    "move": "Mover a...",
    "move_to": "Mover a",
    "delete": "Eliminar",
    "cancel": "Cancelar",
    "confirm": "Confirmar"
  },
  "files": {
    "name": "Nombre",
    "type": "Tipo",
    "size": "Tamaño",
    "modified": "Modificado",
    "no_files": "No hay archivos en esta carpeta",
    "view_grid": "Vista de cuadrícula",
    "view_list": "Vista de lista",
    "file_types": {
      "document": "Documento",
      "image": "Imagen",
      "video": "Video",
      "audio": "Audio",
      "pdf": "PDF",
      "text": "Texto",
      "folder": "Carpeta"
    }
  },
  "dialogs": {
    "rename_folder": "Renombrar carpeta",
    "new_name": "Nuevo nombre",
    "move_file": "Mover archivo",
    "select_destination": "Selecciona la carpeta destino",
    "root": "Raíz",
    "delete_confirmation": "¿Estás seguro de que quieres eliminar",
    "and_contents": "y todo su contenido",
    "no_undo": "Esta acción no se puede deshacer"
  },
  "dropzone": {
    "drag_files": "Arrastra archivos aquí o haz clic para seleccionar",
    "drop_files": "Suelta los archivos para subirlos"
  },
  "errors": {
    "file_not_found": "Archivo no encontrado",
    "folder_not_found": "Carpeta no encontrada",
    "delete_error": "Error al eliminar",
    "upload_error": "Error al subir el archivo",
    "rename_error": "Error al renombrar",
    "move_error": "Error al mover",
    "empty_name": "El nombre no puede estar vacío",
    "name_exists": "Ya existe un archivo o carpeta con ese nombre",
    "generic_error": "Ha ocurrido un error"
  },
  "breadcrumb": {
    "home": "Inicio"
  }
}
10  static/oxicloud-logo.svg  Normal file
@@ -0,0 +1,10 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 500 500">
  <!-- Circular background -->
  <circle cx="250" cy="250" r="200" fill="#f8f9fa" />

  <!-- Main cloud shape - inspired by Nextcloud but with its own style -->
  <path d="M330 280c27.6 0 50-22.4 50-50s-22.4-50-50-50c-5.3 0-10.3 0.8-15.1 2.3C307.4 154.5 282.6 135 253 135c-29.6 0-54.4 19.5-62.9 46.3C185.3 180.8 180.3 180 175 180c-27.6 0-50 22.4-50 50s22.4 50 50 50h155z" fill="#ff5e3a" />

  <!-- Logo text - the name only -->
  <text x="250" y="350" font-family="Arial, sans-serif" font-size="55" font-weight="bold" text-anchor="middle" fill="#2b3a4a">OxiCloud</text>
</svg>
97  static/sw.js  Normal file
@@ -0,0 +1,97 @@
// OxiCloud Service Worker
const CACHE_NAME = 'oxicloud-cache-v1';
const ASSETS_TO_CACHE = [
  '/',
  '/index.html',
  '/js/i18n.js',
  '/js/languageSelector.js',
  '/locales/en.json',
  '/locales/es.json',
  '/favicon.ico',
  'https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.0.0-beta3/css/all.min.css',
  'https://cdn.jsdelivr.net/npm/alpinejs@3.12.3/dist/cdn.min.js'
];

// Install event - cache assets
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(cache => {
        console.log('Cache opened');
        return cache.addAll(ASSETS_TO_CACHE);
      })
      .then(() => self.skipWaiting()) // Activate immediately
  );
});

// Activate event - clean old caches
self.addEventListener('activate', event => {
  event.waitUntil(
    caches.keys().then(cacheNames => {
      return Promise.all(
        cacheNames.filter(cacheName => {
          return cacheName !== CACHE_NAME;
        }).map(cacheName => {
          return caches.delete(cacheName);
        })
      );
    }).then(() => self.clients.claim()) // Take control of clients
  );
});

// Fetch event - serve from cache, update cache from network
self.addEventListener('fetch', event => {
  // Don't intercept API requests - let them go straight to the network
  if (event.request.url.includes('/api/')) {
    return;
  }

  event.respondWith(
    caches.match(event.request)
      .then(response => {
        // Cache hit - return the response from the cached version
        if (response) {
          // For non-core assets, still fetch from network for updates
          if (!ASSETS_TO_CACHE.includes(new URL(event.request.url).pathname)) {
            fetch(event.request).then(networkResponse => {
              if (networkResponse && networkResponse.status === 200) {
                const clonedResponse = networkResponse.clone();
                caches.open(CACHE_NAME).then(cache => {
                  cache.put(event.request, clonedResponse);
                });
              }
            }).catch(() => {
              // Ignore network fetch errors - we already have a cached version
            });
          }
          return response;
        }

        // Not in cache - get from network and add to cache
        return fetch(event.request).then(response => {
          if (!response || response.status !== 200 || response.type !== 'basic') {
            return response;
          }

          // Clone the response as it's a stream and can only be consumed once
          const responseToCache = response.clone();

          caches.open(CACHE_NAME).then(cache => {
            cache.put(event.request, responseToCache);
          });

          return response;
        });
      })
  );
});

// Background sync for failed requests
self.addEventListener('sync', event => {
  if (event.tag === 'oxicloud-sync') {
    event.waitUntil(
      // Implement background sync for pending file operations
      Promise.resolve() // Placeholder for actual implementation
    );
  }
});
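For any of the handlers above to fire, the page must register this worker (e.g. `navigator.serviceWorker.register('/sw.js')`; that registration call is not part of this file). The activate handler's cache cleanup is pure list filtering, so it can be checked in isolation; a minimal Node sketch of that step (the extra cache names here are illustrative, not from the repository):

```javascript
const CACHE_NAME = 'oxicloud-cache-v1';

// Same predicate as the activate handler above: every cache whose name
// differs from the current CACHE_NAME is scheduled for deletion.
function staleCaches(cacheNames, current = CACHE_NAME) {
  return cacheNames.filter(name => name !== current);
}

console.log(staleCaches(['oxicloud-cache-v0', 'oxicloud-cache-v1', 'other-cache']));
// → ['oxicloud-cache-v0', 'other-cache']
```

Bumping `CACHE_NAME` (e.g. to `oxicloud-cache-v2`) is therefore the versioning mechanism: on the next activate, every previously cached version is swept away.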
9  storage/file_ids.json  Normal file
@@ -0,0 +1,9 @@
{
  "path_to_id": {
    "folder_ids.json": "94fa2cbf-e9f3-4811-8071-008e04ef9518",
    "oxicloud-logo.svg": "5348bfcb-f615-4616-8a45-bbdd26430a3c",
    "prueba3/oxicloud-logo.svg": "c59bec72-9c70-4ff2-941f-543d79714f3c",
    "file_ids.json": "b5b94ec6-84e6-4f14-b83a-54e6752f64a6",
    "Manual de urgencias_vf.pdf": "a0c96f4b-9fd3-49ee-8aa0-fa744b04d6aa"
  }
}
BIN  storage/prueba3/NIST.SP.500-322 (3).pdf  Normal file
Binary file not shown.
BIN  storage/prueba3/caffeine64.exe  Normal file
Binary file not shown.
7  storage/prueba3/folder_ids.json  Normal file
@@ -0,0 +1,7 @@
{
  "path_to_id": {
    "prueba5": "58a82b8c-ef9d-43cf-b593-4668fdb0ef03",
    "prueba4": "248abd28-e94e-4c75-b1b6-712d9957d850",
    "prueba3": "30dd94f1-86c6-4b9d-bc9a-d51686f0dbcd"
  }
}
10  storage/prueba3/oxicloud-logo.svg  Normal file
@@ -0,0 +1,10 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 500 500">
  <!-- Circular background -->
  <circle cx="250" cy="250" r="200" fill="#f8f9fa" />

  <!-- Main cloud shape - inspired by Nextcloud but with its own style -->
  <path d="M330 280c27.6 0 50-22.4 50-50s-22.4-50-50-50c-5.3 0-10.3 0.8-15.1 2.3C307.4 154.5 282.6 135 253 135c-29.6 0-54.4 19.5-62.9 46.3C185.3 180.8 180.3 180 175 180c-27.6 0-50 22.4-50 50s22.4 50 50 50h155z" fill="#ff5e3a" />

  <!-- Logo text - the name only -->
  <text x="250" y="350" font-family="Arial, sans-serif" font-size="55" font-weight="bold" text-anchor="middle" fill="#2b3a4a">OxiCloud</text>
</svg>
BIN  storage/prueba4/NIST.SP.500-322 (3).pdf  Normal file
Binary file not shown.
11  storage/prueba4/file_ids.json  Normal file
@@ -0,0 +1,11 @@
{
  "path_to_id": {
    "prueba5/NIST.SP.500-322.pdf": "50d9675e-617f-472d-a6e7-cdcf83dfadfe",
    "prueba3/NIST.SP.500-322.pdf": "a73a94bd-77e5-41ad-8723-9675fa15e66c",
    "file_ids.json": "bda2a482-6947-4922-8263-a3826d13ab11",
    "folder_ids.json": "dbd1c1e4-8441-40d6-8f43-e53a53abfde0",
    "prueba4/folder_ids.json": "d4b559da-bf7e-4754-99bc-970f9c7a64fc",
    "prueba3/NIST.SP.500-322 (3).pdf": "e71813d5-4234-4b0f-9c0a-55acb731ce7e",
    "NIST.SP.500-322 (3).pdf": "95f7039e-06ff-41b7-87a0-2d7af6cbe4fb"
  }
}
7  storage/prueba4/folder_ids.json  Normal file
@@ -0,0 +1,7 @@
{
  "path_to_id": {
    "prueba4": "66992be8-56dd-4c69-b743-98cf9c484c39",
    "prueba5": "f09dfe30-f47a-4901-bf52-7214d601d24b",
    "prueba3": "66047168-e2d3-44d5-85e4-656ed8b2d803"
  }
}
BIN  storage/prueba5/NIST.SP.500-322.pdf  Normal file
Binary file not shown.