Experience

LuckyCloud

Reverse Engineering Consultant

May 2022

Was tasked with reverse engineering the SeaDrive macOS app in order to white-label it. I created a Python script that automatically patches the app, replacing the SeaDrive images/icons with their LuckyCloud equivalents. The script is driven by a shell wrapper for ease of use, and I wrote some AppleScript to re-bundle the app.
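The icon-swap step above could be sketched roughly as follows. This is a minimal illustration, not the actual script: the function name and the assumption that replacements are matched by filename inside the bundle's Resources folder are mine.

```python
# Hypothetical sketch of the branding swap: copy each replacement image
# over the same-named file inside the app bundle's Resources directory.
import shutil
from pathlib import Path

def swap_branding(resources_dir: Path, replacements_dir: Path) -> list:
    """Replace every image in resources_dir that has a same-named
    counterpart in replacements_dir; return the names swapped."""
    swapped = []
    for replacement in replacements_dir.iterdir():
        target = resources_dir / replacement.name
        if target.exists():
            shutil.copy2(replacement, target)  # overwrite original icon
            swapped.append(replacement.name)
    return swapped
```

A shell wrapper would then call this before the AppleScript step re-bundles and re-signs the app.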

Technologies: Python, Ghidra, AppleScript

CryptoToolkit

Software Engineer

April 2022

Created a custom MetaMask fork with special features added to make coin/NFT sniping easy.

Technologies: TypeScript, JavaScript, React, Web Browser Extensions, web3, GitHub

Plotly

R Consultant

Oct - Nov 2022

As an R expert, I was tasked with helping alleviate the burden on the main developer of plotly.R, a data-viz library with more than 11 million downloads. I helped resolve several issues in their GitHub repo.

Technologies: R, ggplot2, Plotly.R, Git, GitHub

GreekAdvisors

Software & Data Engineer

July 2021 - Present

I was tasked by the client with creating scrapers for tradingview.com, deploying them to AWS, and building a GUI and a website from which he and his team could manage their resources and monitor the scrapers' progress.

I leveraged boto3 and SSH to make it easy to provision EC2 instances, start them, manage the setup/update process, upload new files for the scrapers, and retrieve each instance's status along with important stats (like avg. time per line, ETA, progress ...).
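The per-instance stats mentioned above boil down to simple arithmetic over the scraper's counters. A hedged sketch, with names of my own choosing rather than the project's actual API:

```python
# Illustrative only: derive avg. time per line, progress, and ETA
# from a scraper's line counters and elapsed wall-clock time.
def scraper_stats(lines_done: int, lines_total: int, elapsed_s: float) -> dict:
    """Return average seconds per line, percent progress, and the
    estimated seconds remaining for one scraper instance."""
    avg = elapsed_s / lines_done if lines_done else 0.0
    progress = 100.0 * lines_done / lines_total if lines_total else 0.0
    return {
        "avg_s_per_line": avg,
        "progress_pct": progress,
        "eta_s": avg * (lines_total - lines_done),
    }
```

The GUI would poll each instance over SSH for its counters and render these numbers per EC2 instance.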

Technologies: Python, React, Flask, uWSGI, NGINX, AWS, SSH, Selenium, boto3, MongoDB, SQLite

The Big Data Company

Software & Data Engineering Consultant

June 2021 - June 2022

As a data/software engineering consultant, I was tasked with building multiple APIs to help TBDC provide accurate results and beautiful GIS maps.

During my tenure with TBDC, I built a Dockerized statistics API leveraging the power of R, proxied by an Express.js server to support concurrent users. I also built a crawler that consumes the API of an agricultural machinery manufacturer and generates choropleths, which are saved to an S3 bucket and referenced in the database so the TBDC app can display those images in a Facebook news-feed style.

Technologies: R, Docker, Node.js, Express, AWS, S3 buckets, MySQL, GDAL, Google Maps API, GitLab, OpenAPI, REST


Skills

Languages

JavaScript, TypeScript, Python, C#, VB.NET, VBA, C/C++, Bash, Lua

Frameworks/Libraries

Qt, and so many more...

Tools/Applications

Git, NeoVim, Figma, Ghidra, macOS, Windows, Linux