About idleak.net
Indonesia leak, ransomware, and data breach monitoring

Prologue
Cyber security is always evolving. No matter which area of cyber you work in, there is always a need to learn new stuff, constantly. On one hand, this means never-ending fun, finding out new stuff and trying it out; on the other hand, it is super annoying to have to keep up with so many things.
In my field especially, there is a need to be constantly aware of new threats, whether it is a new leak, a new ransomware attack, or a new CVE in widely used software. But new stuff is not always a threat: there are myriad new ways of doing things, cool new software that can potentially be useful to you, edge cases, use cases, and new detection techniques. This part feeds into the never-ending fun of finding out and trying out stuff.
With the deluge of information that needs handling, I quickly realized that I need to differentiate between these streams of information. Having my own sort of Eisenhower Matrix is mandatory if I am to stay afloat in this cyber business. My current solution is a knowledge base/notes setup for my “fun tryout stuff” (detailed in the graph below).

While the collection for my “fun tryout stuff” is pretty much settled, the urgent/important part of my Eisenhower Matrix is actually the new threats, new attacks, and new CVEs. I need to be in the know for all these things, especially if they relate closely to the Indonesian market. Currently I have ntfy.sh to monitor ransomware, and a bunch of Twitter/Mastodon/Bluesky feeds for CVEs, leaks, and the like. The main pain point is that 80% of this information is not related to my use case. Sure, there might be a ransomware attack involving an Indonesian entity, or a data leak in an industry closely related to mine, but these are few and far between.
So the need arises for some sort of feed that will inform me of incidents that have high value to me (i.e. related to Indonesia). Earlier I was aiming for a Telegram/Discord bot as the front end, with a back end made up of workers for different sources (i.e. ransomware, CVEs, leaks, etc.). In the end I opted for a dashboard as the front end instead. The idea is that a dashboard will be easier to share in the future if needed, and it might even help someone right now; hence idleak.net.
Design and Development
Ideally I wanted the site to track ransomware incidents, cyber news, and leaks from stealers, all focusing on the Indonesian market. My plan was to use Python as the back end to gather the data, put it in a database (after formatting and cleansing), and serve the result from static files. To achieve this I planned the development in stages, with incremental additions in the future. I aim for these features to be developed across the stages:
Phase 1: completed worker 01 (ransomware)
Phase 2: completed front end
Phase 3: heartbeat function for worker 01
Phase 4: completed worker 02 (cyber news)
Phase 5: completed worker 03 (stealer)

Currently we are at phase 2, so the first back-end worker and the front end are complete, with more workers in the pipeline. You can check out the project here, and the site at idleak.net.
Back End
The project uses Python for the back end, with a specific function for each worker: scraping Telegram, Twitter, RSS feeds, APIs, or ransomware boards. The worker scripts are hosted in AWS Lambda, with AWS built-in roles and permissions. The back end also handles data cleansing and filtering, which varies based on the worker's role. Finally, a database check is done by connecting to Supabase, to see if similar information already exists, and the record is posted if it does not. Here is a more detailed description of the functions inside the worker.
Initialization and Setup: The program first loads configuration settings (like the Supabase database URL, secret key, and the target country code) from environment variables. It sets up a logging mechanism to record its activities.
Connect to Database: It attempts to establish a secure connection to the Supabase database. If the connection fails, the process stops immediately and logs an error.
Fetch External Data: It makes a request to the ransomware.live API using the configured country code (e.g., "id" for Indonesia) to retrieve a list of recent ransomware victim reports.
Process and Filter by Time: The retrieved raw data is then processed. Any report older than a configured time limit (e.g., 60 hours) is discarded. The remaining recent reports are standardized into a clean format suitable for the database.
Check for Duplicates: To prevent storing the same information twice, the program queries the Supabase database to get a list of keys (title, publish date, and source URL) for all existing records. It then compares the new, processed reports against these existing keys, filtering out any entries already present in the database.
Insert New Records: The resulting list, containing only the unique, new victim reports, is then inserted into the Supabase table.
Final Report: Finally, the program logs how many unique records were successfully inserted and returns a status indicating the operation is complete.
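The time-filter and duplicate-check steps above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the ransomware.live field names (`post_title`, `discovered`, `post_url`) and the 60-hour limit are assumptions based on the description.

```python
from datetime import datetime, timedelta

MAX_AGE_HOURS = 60  # configured time limit; an assumption from the text

def filter_recent(reports, now, max_age_hours=MAX_AGE_HOURS):
    """Discard reports older than the time limit and standardize the
    remaining ones into the shape used by the database."""
    cutoff = now - timedelta(hours=max_age_hours)
    recent = []
    for r in reports:
        discovered = datetime.fromisoformat(r["discovered"])
        if discovered >= cutoff:
            recent.append({
                "title": r["post_title"],
                "published": r["discovered"],
                "source_url": r["post_url"],
            })
    return recent

def drop_duplicates(new_rows, existing_keys):
    """Keep only rows whose (title, publish date, source URL) key is not
    already present in the database."""
    return [
        row for row in new_rows
        if (row["title"], row["published"], row["source_url"]) not in existing_keys
    ]
```

In the real worker, `existing_keys` would be built from a Supabase select over those three columns, and the surviving rows would then be inserted in one batch.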
From time to time I needed to check whether my worker had successfully picked up the newest ransomware attack, or whether the Lambda was dead. This is quite annoying, as I have to go dig through the CloudWatch logs for the Lambda. So I decided to add some sort of heartbeat function for the worker. This, however, pushes other features to later phases.
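One lightweight way such a heartbeat could work (a sketch under my own assumptions, not the project's implementation) is for the worker to write a timestamp row after every successful run, and for the dashboard to compare it against a staleness threshold. The table name and the 26-hour threshold below are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical: the worker would upsert a row like {"worker": "ransomware",
# "last_run": now.isoformat()} into a Supabase "heartbeat" table after each
# successful run; the dashboard then only needs this check instead of
# digging through CloudWatch logs.
STALE_AFTER_HOURS = 26  # a bit more than one daily run; an assumption

def heartbeat_status(last_run: datetime, now: datetime,
                     stale_after_hours: int = STALE_AFTER_HOURS) -> str:
    """Return "alive" if the worker checked in recently, else "stale"."""
    if now - last_run <= timedelta(hours=stale_after_hours):
        return "alive"
    return "stale"
```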
Front End
The front end uses static files, with JavaScript pulling from the database to sort and display the data. The CSS uses PureCSS, and Grid.js serves the tables. Here is a more detailed explanation of the front end.
The ransom.html file provides the necessary structure, or skeleton, for the webpage.
This is the basic web page. It builds the layout, including the header and a simple side menu. Most importantly, it creates an empty box where the data table will go and loads all the necessary external tools, like the database connector and the table builder.
The ransom.js file contains the application's core logic and data handling.
This is the part that does the work. It connects to the Supabase cloud database, pulls the list of ransomware incidents, and then uses a tool (Grid.js) to turn that data into a functional table. It then drops this completed, interactive table right into the empty box on the HTML page.
The ui.js file is a dedicated script for controlling the visual interaction of the page's layout.
This is a simple script that just handles the side menu. It makes sure that when you click the menu icon, the menu slides in and out correctly, keeping the website clean and easy to use.
The next phases will build on this workflow, so similar HTML, JS, and CSS will be reused to serve different tables from different back-end workers.
Epilogue
The project arose from my need for a streamlined place to check for ransomware, breaches, and leaks focused on the Indonesian market. A Python worker back end collects and cleans the required data into a Supabase database, and static HTML/CSS/JS serves the information, which can be accessed at idleak.net.



