Projects
Salesforce (2023 - Present)
Mulesoft (2023 - Present)
MuleSoft empowers seamless integration of data, systems, and AI models, securely automating tasks and processes across your entire organization, including legacy systems. It enables both developers and business users to build efficiently using intuitive clicks, powerful code, and AI-powered natural language prompts, ensuring rapid innovation and streamlined operations.
Product Link: https://www.mulesoft.com
Documentation: https://docs.mulesoft.com
Informatica (2011 - 2023)
Cloud Data Governance and Catalog (2020 - 2023)
The Informatica Cloud Data Governance and Catalog is a unified, cloud-native tool that enables customers to find, understand, govern, and trust their data. It brings together data cataloging, governance, quality, and democratization capabilities into a new, singular cloud-native solution for data intelligence.
Product Link: https://www.informatica.com/products/data-governance/cloud-data-governance-and-catalog.html
Enterprise Data Catalog (2016 - 2020)
A machine learning-based data catalog that lets you discover, enrich, classify, and organize data assets across any environment to maximize data value and reuse, and provides a metadata system of record for the enterprise.
Data Catalog helps organizations find and govern data using an organized and enriched inventory of data assets across the enterprise. It is a self-service solution in which data consumers can easily discover, understand, track, and govern data assets spread across multiple source systems.
Key Features:
Scan and index metadata.
Discover and profile data assets.
Provide detailed lineage across tens of millions of data sets.
Provide a metadata system of record for the enterprise with a catalog of catalogs.
Environment: Java, Spring, Hadoop, HBase, JanusGraph, Apache Solr
Product Link: https://www.informatica.com/in/products/data-catalog.html
Enterprise Data Lake (2015)
Generic Connectivity SDK (2011 - 2014)
Connectivity solutions for MS Dynamics, Netezza, and Salesforce (2011 - 2014)
Teleonto Technologies Pvt. Ltd. (2006 - 2010)
CApTOR (A Revenue Assurance Product from Teleonto)
CApTOR™ is an integrated revenue assurance product used to proactively detect and prevent revenue leakages that occur at various touch points in a telecom operator's collection systems.
Environment: Lisp, C++, Java, ApT (An SOA framework in Lisp), Allegro Graph, Allegro Cache, PostgreSQL, GridSQL.
Key Features:
Appliance-based shared-nothing architecture to support massively parallel processing and easy scalability.
SOA-based layered architectural design.
User Traffic Analyzer
An analytic tool developed for revenue assurance analysts to analyze the calls (Call Detail Records) made by a telco's subscribers over a period of time. Calls are processed and stored in a graph database in an easily searchable and traversable form.
Environment: Lisp, ApT, Allegro Graph.
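The call-graph storage described above was built in Lisp on Allegro Graph; the following is a minimal illustrative Python sketch (with hypothetical `caller`/`callee` field names) of how CDRs can be stored as a directed graph and traversed.

```python
from collections import defaultdict
from typing import Dict, Iterable, Set

def build_call_graph(cdrs: Iterable[dict]) -> Dict[str, Set[str]]:
    """Index CDRs as a directed graph: caller -> set of callees."""
    graph: Dict[str, Set[str]] = defaultdict(set)
    for cdr in cdrs:
        graph[cdr["caller"]].add(cdr["callee"])
    return dict(graph)

def reachable(graph: Dict[str, Set[str]], start: str) -> Set[str]:
    """Traverse the graph to find every subscriber reachable from `start`."""
    seen: Set[str] = set()
    stack = [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

Storing calls this way makes subscriber-behavior queries (who a subscriber reaches, directly or indirectly) a simple graph traversal rather than a join over raw records.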
Flexible and Customizable Rating Engine
Led a team of four in developing a rating engine that can be easily customized for a telco. Used a distributed rating architecture in which a Rating Manager distributes the load across several Rating Servers.
Environment: Lisp, ApT (SOA framework), Lisa (Rule Engine).
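The Rating Manager / Rating Server split described above was implemented in Lisp with the ApT framework; this is only a schematic Python sketch of the load-distribution idea, with a hypothetical duration-based rating function.

```python
from itertools import cycle
from typing import Callable, Iterable, List

def make_rater(rate_per_sec: float) -> Callable[[dict], float]:
    """A hypothetical rating function: charge = call duration * tariff."""
    return lambda cdr: cdr["duration_sec"] * rate_per_sec

def rating_manager(cdrs: Iterable[dict],
                   servers: List[Callable[[dict], float]]) -> List[float]:
    """Distribute CDRs across rating servers round-robin (a simple
    stand-in for the manager's load-balancing role)."""
    pool = cycle(servers)
    return [next(pool)(cdr) for cdr in cdrs]
```

In the real system the "servers" are separate processes and the distribution policy accounts for load, but the manager/worker shape is the same.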
CApTOR – Workflows
Workflows is the core functionality of CApTOR, based on service-oriented architecture. It acts as intelligent middleware between the GUI and the core functional components (such as FTP Pull, Mediation, DB Loading, Reconciliation, and Rating) and distributes the work across multiple cores/CPUs.
Environment: Lisp, Java, ApT (An SOA framework in Lisp), JavaScript, Jakarta Tomcat
Generic distributed Reconciliation Software
Developed a generic distributed reconciliation engine that reconciles two CSV (comma-separated values) sources and produces a detailed classification report.
Environment: Lisp, C++, ApT (An SOA framework in Lisp), Allegro Cache (An object database)
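The engine itself was a distributed Lisp/C++ system; as a minimal single-node Python sketch, reconciliation of two keyed CSV sources and the classification report can look like this (the key column name is an assumption for illustration).

```python
import csv
from typing import Dict, List

def load_records(path: str, key_field: str) -> Dict[str, dict]:
    """Index a CSV source by a key column (hypothetical field name)."""
    with open(path, newline="") as f:
        return {row[key_field]: row for row in csv.DictReader(f)}

def reconcile(a: Dict[str, dict], b: Dict[str, dict]) -> Dict[str, List[str]]:
    """Classify record keys as matched, mismatched, or missing on either side."""
    report: Dict[str, List[str]] = {
        "matched": [], "mismatched": [], "only_in_a": [], "only_in_b": []
    }
    for key, row in a.items():
        other = b.get(key)
        if other is None:
            report["only_in_a"].append(key)
        elif row == other:
            report["matched"].append(key)
        else:
            report["mismatched"].append(key)
    report["only_in_b"] = [k for k in b if k not in a]
    return report
```

The distributed version partitions the key space across nodes so each node reconciles a disjoint shard and the reports are merged.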
DB Layer & Multi-dimensional Data Summary
As a member of a very small team, developed a database layer for CApTOR (the telecom revenue assurance product) using an object-oriented database (Allegro Cache) to load the full daily data volume (150 million records), build a multi-dimensional summary, and expose a query interface over both the summary and the base data. A full month of data (4.5 billion CDRs) can be loaded and queried.
Environment: Lisp, ApT (An SOA framework in Lisp), Allegro Cache
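The summary layer above lived in Allegro Cache; conceptually, a multi-dimensional summary is a roll-up of the base records along chosen dimensions, as in this illustrative Python sketch (dimension and measure names are assumptions).

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def summarize(cdrs: List[dict], dims: Tuple[str, ...],
              measure: str) -> Dict[Tuple[str, ...], float]:
    """Roll base records up along `dims`, summing one measure.
    The result is far smaller than the base data and fast to query."""
    summary: Dict[Tuple[str, ...], float] = defaultdict(float)
    for cdr in cdrs:
        key = tuple(cdr[d] for d in dims)
        summary[key] += cdr[measure]
    return dict(summary)

def query(summary: Dict[Tuple[str, ...], float],
          prefix: Tuple[str, ...]) -> float:
    """Aggregate summary cells whose leading dimensions match `prefix`."""
    return sum(v for k, v in summary.items() if k[:len(prefix)] == prefix)
```

Queries that match the summarized dimensions hit the compact summary; anything finer-grained falls through to the base data.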
Roaming Analytics using Ontology
Analyzes the behavior of in-roamers of a telecom operator in order to improve their quality of service and attract them. Modeled an ontology and developed a knowledge base using:
OWL (Semantic Web) and a graph database (Allegro Graph)
Frame-based (or class-based) modeling and an object database (Allegro Cache)
Developed the whole system from scratch and was able to show significant results within two weeks. Research for this project covered the Semantic Web, clustering techniques, and the use of OWL, Racer, and other related technologies.
Academic Projects (2004-2006)