# Pagination

Navigate through large datasets efficiently.

## Cursor-Based Pagination
The Betalink API uses cursor-based pagination for efficient navigation through large datasets. This approach is more reliable than offset-based pagination when data changes frequently: if items are inserted or deleted between requests, offset-based pages can skip or repeat records, while a cursor always resumes from a fixed point.
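To see the difference concretely, here is a toy in-memory comparison (hypothetical data and helpers, not the Betalink client): when an already-seen item is deleted mid-pagination, offset-based paging silently skips a record, while cursor-based paging resumes right after the last seen id.

```typescript
// Toy in-memory comparison (hypothetical data, not the Betalink client).
type Item = { id: string };

let items: Item[] = ["a", "b", "c", "d", "e"].map((id) => ({ id }));

// Offset-based: position is an index into whatever the list looks like *now*.
function offsetPage(offset: number, limit: number): Item[] {
  return items.slice(offset, offset + limit);
}

// Cursor-based: position is "everything after this id".
function cursorPage(after: string | undefined, limit: number): Item[] {
  const start =
    after === undefined ? 0 : items.findIndex((i) => i.id === after) + 1;
  return items.slice(start, start + limit);
}

// Both schemes read the first page of two items: a, b.
offsetPage(0, 2);
cursorPage(undefined, 2);

// An already-seen item is deleted before the second page is fetched.
items = items.filter((i) => i.id !== "a");

const viaOffset = offsetPage(2, 2); // d, e -- "c" was silently skipped
const viaCursor = cursorPage("b", 2); // c, d -- nothing missed
```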
## Pagination Parameters
All list endpoints accept these parameters:
| Parameter | Type | Default | Description |
|---|---|---|---|
| `limit` | number | 50 | Number of items per page (1-100) |
| `after` | string | - | Cursor for the next page |
| `before` | string | - | Cursor for the previous page |
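These parameters map straight onto the query string of a list request. A minimal sketch of that mapping (`paginationQuery` is a hypothetical helper, not part of the client):

```typescript
// Build a list-endpoint query string from pagination parameters.
// Hypothetical helper; parameter names mirror the table above.
function paginationQuery(params: {
  limit?: number;
  after?: string;
  before?: string;
}): string {
  const qs = new URLSearchParams();
  if (params.limit !== undefined) qs.set("limit", String(params.limit));
  if (params.after !== undefined) qs.set("after", params.after);
  if (params.before !== undefined) qs.set("before", params.before);
  return qs.toString();
}

paginationQuery({ limit: 25, after: "comp_abc123" });
// → "limit=25&after=comp_abc123"
```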
```typescript
// Get first page with custom limit
const firstPage = await client.listCompanies({ limit: 25 });

// Get next page using cursor
const nextPage = await client.listCompanies({
  limit: 25,
  after: firstPage.data[firstPage.data.length - 1].id,
});
```

## Response Structure
Paginated responses include metadata and navigation links:
```typescript
interface PaginatedResponse<T> {
  data: T[];
  meta: {
    total: number;    // Total number of items
    limit: number;    // Items per page
    hasMore: boolean; // Whether more pages exist
  };
  _links: {
    self: string;        // Current page URL
    next: string | null; // Next page URL
    prev: string | null; // Previous page URL
  };
}
```

### Example Response
```json
{
  "data": [
    {
      "id": "comp_abc123",
      "cvr": "12345678",
      "name": "Acme ApS"
    }
  ],
  "meta": {
    "total": 150,
    "limit": 50,
    "hasMore": true
  },
  "_links": {
    "self": "/api/v1/companies?limit=50",
    "next": "/api/v1/companies?limit=50&after=comp_abc123",
    "prev": null
  }
}
```

## Iterating All Pages
### Using `hasMore`
```typescript
async function getAllCompanies() {
  const allCompanies: Company[] = [];
  let after: string | undefined;

  while (true) {
    const response = await client.listCompanies({ limit: 100, after });
    allCompanies.push(...response.data);
    if (!response.meta.hasMore || response.data.length === 0) break;
    after = response.data[response.data.length - 1].id;
  }

  return allCompanies;
}
```

### Using Async Generator
```typescript
async function* iterateCompanies(limit = 100) {
  let after: string | undefined;

  while (true) {
    const response = await client.listCompanies({ limit, after });
    for (const company of response.data) {
      yield company;
    }
    // Stop on the last page, and guard against an empty page before
    // reading the next cursor.
    if (!response.meta.hasMore || response.data.length === 0) break;
    after = response.data[response.data.length - 1].id;
  }
}

// Usage
for await (const company of iterateCompanies()) {
  console.log(company.name);
}
```

## Transactions Pagination
Transaction endpoints work the same way:
```typescript
const transactions = await client.getCompanyTransactions("12345678", {
  limit: 50,
});

if (transactions.meta.hasMore) {
  const lastId = transactions.data[transactions.data.length - 1].id;
  const nextPage = await client.getCompanyTransactions("12345678", {
    limit: 50,
    after: lastId,
  });
}
```

## Best Practices
### Choose an Appropriate Page Size
- Small pages (10-25): Better for UI pagination with quick responses
- Medium pages (50): Good default for most use cases
- Large pages (100): Best for bulk data processing
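Because `limit` must stay within the documented 1-100 range, clamping caller-supplied values avoids request errors. A small sketch (`clampLimit` is a hypothetical helper, not part of the client):

```typescript
// Clamp a requested page size into the API's accepted 1-100 range.
function clampLimit(requested: number, min = 1, max = 100): number {
  return Math.min(max, Math.max(min, Math.trunc(requested)));
}

clampLimit(250); // → 100
clampLimit(0); // → 1
```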
### Handle Empty Pages
Always check if data is present before accessing cursors:
```typescript
const response = await client.listCompanies({ limit: 50, after });

if (response.data.length === 0) {
  console.log("No more data");
  return;
}

const nextCursor = response.data[response.data.length - 1].id;
```

### Use HATEOAS Links
The `_links` object provides ready-to-use URLs:
```typescript
const response = await client.listCompanies();

if (response._links.next) {
  // Parse the next URL to extract the after parameter
  const url = new URL(response._links.next, "https://betalink.dev");
  const after = url.searchParams.get("after");
}
```

### Rate Limiting Considerations
When fetching many pages, implement delays to avoid rate limits:
```typescript
async function getAllWithDelay() {
  const allCompanies: Company[] = [];
  let after: string | undefined;

  while (true) {
    const response = await client.listCompanies({ limit: 100, after });
    allCompanies.push(...response.data);
    // Stop on the last page or an empty page before reading the cursor.
    if (!response.meta.hasMore || response.data.length === 0) break;
    after = response.data[response.data.length - 1].id;

    // Add delay between requests
    await new Promise((r) => setTimeout(r, 100));
  }

  return allCompanies;
}
```
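For heavier workloads, a fixed delay can be replaced by exponential backoff on rate-limit errors. A sketch under stated assumptions: `backoffDelay` and `withRetry` are hypothetical helpers, and the numeric `status` field on rejected requests is an assumption about the client's error shape.

```typescript
// Exponential backoff with jitter; hypothetical helpers, and the error's
// numeric `status` field is an assumption about the client's error shape.
function backoffDelay(attempt: number, baseMs = 100, capMs = 5000): number {
  const exp = Math.min(capMs, baseMs * 2 ** attempt);
  // "Equal jitter": at least half the exponential delay, plus a random half.
  return exp / 2 + Math.random() * (exp / 2);
}

async function withRetry<T>(fn: () => Promise<T>, maxAttempts = 5): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err: any) {
      // Give up on non-rate-limit errors or once attempts are exhausted.
      if (err?.status !== 429 || attempt + 1 >= maxAttempts) throw err;
      await new Promise((r) => setTimeout(r, backoffDelay(attempt)));
    }
  }
}
```

Wrapping each page fetch in `withRetry` then takes the place of the fixed 100 ms delay.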