Cloud storage
Upload files directly to cloud storage providers without routing them through your own server.
Amazon S3
Installation
npm install @samithahansaka/dropup
Basic usage
import { useDropup } from '@samithahansaka/dropup';
import { createS3Uploader } from '@samithahansaka/dropup/cloud/s3';
function S3Uploader() {
const { files, actions, getDropProps, getInputProps } = useDropup({
upload: createS3Uploader({
getPresignedUrl: async (file) => {
        // Call your backend to get a presigned URL
const response = await fetch('/api/s3/presign', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
filename: file.name,
contentType: file.type,
}),
});
return response.json();
},
}),
onUploadComplete: (file) => {
      console.log('Uploaded to S3:', file.uploadedUrl);
},
});
return (
<div {...getDropProps()}>
<input {...getInputProps()} />
      <p>Drop files here to upload them to S3</p>
</div>
);
}
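getPresignedUrl resolves with your backend's JSON response. Judging by the backend examples in this guide, the expected shape is presumably along these lines (a hypothetical sketch, not the library's actual exported type):
// Hypothetical response shape, inferred from the backend examples below.
interface PresignedTarget {
  url: string;                     // presigned URL the browser uploads to
  fields?: Record<string, string>; // extra form fields (POST policy uploads only)
}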
Backend: Generating a presigned URL
// Node.js / Express example
import express from 'express';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
const app = express();
app.use(express.json()); // parse JSON request bodies
const s3 = new S3Client({
region: process.env.AWS_REGION,
credentials: {
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
},
});
app.post('/api/s3/presign', async (req, res) => {
const { filename, contentType } = req.body;
const key = `uploads/${Date.now()}-${filename}`;
const command = new PutObjectCommand({
Bucket: process.env.S3_BUCKET,
Key: key,
ContentType: contentType,
});
const url = await getSignedUrl(s3, command, { expiresIn: 3600 });
res.json({
url,
    fields: {}, // no extra fields are needed for a simple PUT
});
});
S3 with POST (multipart form)
For S3 POST policies:
createS3Uploader({
getPresignedUrl: async (file) => {
const response = await fetch('/api/s3/presign-post', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
filename: file.name,
contentType: file.type,
}),
});
const { url, fields } = await response.json();
return {
      url, // the S3 bucket URL
      fields, // policy fields to include in the form
};
},
});
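The matching /api/s3/presign-post route isn't shown above. A minimal backend sketch using the AWS SDK's @aws-sdk/s3-presigned-post helper (the route path and the 10 MB cap are illustrative assumptions, reusing the app and s3 client from the PUT example):
// Node.js / Express example
import { createPresignedPost } from '@aws-sdk/s3-presigned-post';
app.post('/api/s3/presign-post', async (req, res) => {
  const { filename, contentType } = req.body;
  const { url, fields } = await createPresignedPost(s3, {
    Bucket: process.env.S3_BUCKET,
    Key: `uploads/${Date.now()}-${filename}`,
    Conditions: [
      ['content-length-range', 0, 10 * 1024 * 1024], // cap uploads at 10 MB
    ],
    Fields: { 'Content-Type': contentType },
    Expires: 600, // seconds
  });
  res.json({ url, fields });
});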
Google Cloud Storage
import { createGCSUploader } from '@samithahansaka/dropup/cloud/gcs';
function GCSUploader() {
const { files, getDropProps, getInputProps } = useDropup({
upload: createGCSUploader({
getSignedUrl: async (file) => {
const response = await fetch('/api/gcs/sign', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
filename: file.name,
contentType: file.type,
}),
});
return response.json();
},
}),
});
return (
<div {...getDropProps()}>
<input {...getInputProps()} />
      <p>Upload to Google Cloud Storage</p>
</div>
);
}
Backend: Generating a GCS signed URL
// Node.js example
import { Storage } from '@google-cloud/storage';
const storage = new Storage();
const bucket = storage.bucket(process.env.GCS_BUCKET);
app.post('/api/gcs/sign', async (req, res) => {
const { filename, contentType } = req.body;
const blob = bucket.file(`uploads/${Date.now()}-${filename}`);
const [url] = await blob.getSignedUrl({
version: 'v4',
action: 'write',
    expires: Date.now() + 15 * 60 * 1000, // 15 minutes
contentType,
});
res.json({ url });
});
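Browser uploads to GCS also require a CORS policy on the bucket. A hedged one-time setup using the same @google-cloud/storage client (the origin is a placeholder; adjust it for your app):
// One-time bucket setup: allow browser PUT uploads from your origin.
await bucket.setCorsConfiguration([
  {
    origin: ['https://app.example.com'], // placeholder origin
    method: ['PUT'],
    responseHeader: ['Content-Type'],
    maxAgeSeconds: 3600,
  },
]);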
Azure Blob Storage
import { createAzureUploader } from '@samithahansaka/dropup/cloud/azure';
function AzureUploader() {
const { files, getDropProps, getInputProps } = useDropup({
upload: createAzureUploader({
getSasUrl: async (file) => {
const response = await fetch('/api/azure/sas', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
filename: file.name,
contentType: file.type,
}),
});
return response.json();
},
}),
});
return (
<div {...getDropProps()}>
<input {...getInputProps()} />
      <p>Upload to Azure Blob Storage</p>
</div>
);
}
Backend: Generating an Azure SAS URL
// Node.js example
import {
BlobServiceClient,
generateBlobSASQueryParameters,
BlobSASPermissions,
} from '@azure/storage-blob';
const blobServiceClient = BlobServiceClient.fromConnectionString(
process.env.AZURE_STORAGE_CONNECTION_STRING
);
app.post('/api/azure/sas', async (req, res) => {
const { filename, contentType } = req.body;
const containerClient = blobServiceClient.getContainerClient('uploads');
const blobName = `${Date.now()}-${filename}`;
const blobClient = containerClient.getBlockBlobClient(blobName);
const sasToken = generateBlobSASQueryParameters(
{
containerName: 'uploads',
blobName,
      permissions: BlobSASPermissions.parse('cw'), // create, write
expiresOn: new Date(Date.now() + 15 * 60 * 1000),
},
blobServiceClient.credential
).toString();
res.json({
url: `${blobClient.url}?${sasToken}`,
headers: {
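      // x-ms-blob-type is required when creating a block blob with a single PUT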
'x-ms-blob-type': 'BlockBlob',
'Content-Type': contentType,
},
});
});
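Because the client above was built from a connection string (so it holds a shared-key credential), the SDK's generateSasUrl convenience method should yield an equivalent URL with less ceremony; a sketch of the same route body under that assumption:
// Inside the route handler: let the blob client build its own SAS URL.
const sasUrl = await blobClient.generateSasUrl({
  permissions: BlobSASPermissions.parse('cw'), // create, write
  expiresOn: new Date(Date.now() + 15 * 60 * 1000),
});
res.json({
  url: sasUrl, // already includes the SAS query string
  headers: { 'x-ms-blob-type': 'BlockBlob', 'Content-Type': contentType },
});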
Cloudflare R2
R2 is S3-compatible, so it uses the S3 uploader:
import { createS3Uploader } from '@samithahansaka/dropup/cloud/s3';
function R2Uploader() {
const { files, getDropProps, getInputProps } = useDropup({
upload: createS3Uploader({
getPresignedUrl: async (file) => {
        const response = await fetch('/api/r2/presign', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ filename: file.name }),
        });
return response.json();
},
}),
});
return (
<div {...getDropProps()}>
<input {...getInputProps()} />
      <p>Upload to Cloudflare R2</p>
</div>
);
}
Backend: Generating an R2 presigned URL
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
const s3 = new S3Client({
region: 'auto',
endpoint: `https://${process.env.CF_ACCOUNT_ID}.r2.cloudflarestorage.com`,
credentials: {
accessKeyId: process.env.R2_ACCESS_KEY_ID,
secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
},
});
app.post('/api/r2/presign', async (req, res) => {
const { filename } = req.body;
const command = new PutObjectCommand({
Bucket: process.env.R2_BUCKET,
Key: `uploads/${Date.now()}-${filename}`,
});
const url = await getSignedUrl(s3, command, { expiresIn: 3600 });
res.json({ url });
});
DigitalOcean Spaces
Also S3-compatible:
// Same as S3; just update the endpoint configuration in your backend
const s3 = new S3Client({
region: 'nyc3',
endpoint: 'https://nyc3.digitaloceanspaces.com',
credentials: {
accessKeyId: process.env.DO_SPACES_KEY,
secretAccessKey: process.env.DO_SPACES_SECRET,
},
});
Custom cloud provider
Build your own uploader for any cloud service:
import { useDropup, type CustomUploader } from '@samithahansaka/dropup';
const customCloudUploader: CustomUploader = async (file, options) => {
  // 1. Get an upload URL from your backend
  const { uploadUrl, fileUrl } = await fetch('/api/custom-cloud/init', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ filename: file.name, size: file.size }),
  }).then(r => r.json());
  // 2. Upload the file with progress reporting
const xhr = new XMLHttpRequest();
return new Promise((resolve, reject) => {
xhr.upload.onprogress = (e) => {
if (e.lengthComputable) {
options.onProgress((e.loaded / e.total) * 100);
}
};
xhr.onload = () => {
if (xhr.status >= 200 && xhr.status < 300) {
resolve({ url: fileUrl });
} else {
        reject(new Error('Upload failed'));
}
};
    xhr.onerror = () => reject(new Error('Network error'));
    // Abort the request when the hook cancels the upload
options.signal.addEventListener('abort', () => xhr.abort());
xhr.open('PUT', uploadUrl);
xhr.send(file.file);
});
};
function CustomCloudUploader() {
const { files, getDropProps, getInputProps } = useDropup({
upload: customCloudUploader,
});
return (
<div {...getDropProps()}>
<input {...getInputProps()} />
      <p>Upload to a custom cloud provider</p>
</div>
);
}
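For reference, here are the parts of the CustomUploader contract this example relies on, written as a hypothetical TypeScript sketch (inferred from the code above; the library's exported types are authoritative):
// Hypothetical shapes inferred from the example, not the library's actual types.
interface UploaderOptions {
  onProgress: (percent: number) => void; // 0-100, fed by xhr.upload.onprogress
  signal: AbortSignal;                   // fires when the user cancels the upload
}
interface DropupFile {
  name: string;
  size: number;
  file: File; // the underlying browser File object
}
type CustomUploader = (
  file: DropupFile,
  options: UploaderOptions
) => Promise<{ url: string }>;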
Security best practices
- Never expose credentials in the client - always generate signed URLs on your backend
- Use short expiration times - 5-15 minutes is usually enough
- Validate file types on the backend - don't rely on client-side validation alone
- Configure appropriate CORS policies on your cloud storage (see the sketch after this list)
- Limit file sizes in your presigned URL policies
- Use separate buckets for user uploads vs. application assets
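As a concrete example of the CORS point above, here is a hedged one-time setup for an S3 (or R2) bucket using the same @aws-sdk/client-s3 client (the origin is a placeholder):
import { PutBucketCorsCommand } from '@aws-sdk/client-s3';
// One-time bucket setup: allow browser PUT/POST uploads from your app's origin.
await s3.send(new PutBucketCorsCommand({
  Bucket: process.env.S3_BUCKET,
  CORSConfiguration: {
    CORSRules: [
      {
        AllowedOrigins: ['https://app.example.com'], // placeholder origin
        AllowedMethods: ['PUT', 'POST'],
        AllowedHeaders: ['Content-Type'],
        ExposeHeaders: ['ETag'],
        MaxAgeSeconds: 3600,
      },
    ],
  },
}));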