Informatica (2011 - Present)

  1. Cloud Data Governance and Catalog (2020 - Present)

  2. Enterprise Data Catalog (2016 - 2020)

    • A machine learning-based data catalog that lets you discover, enrich, classify and organize data assets across any environment to maximize data value and reuse, and provides a metadata system of record for the enterprise.

    • Data Catalog helps organizations find and govern data using an organized, enriched inventory of data assets across the enterprise. It is a self-service solution where data consumers can easily discover, understand, track, and govern data assets spread across multiple source systems.

    • Key Features:

      • Scan and index metadata.

      • Discover and profile data.

      • Provide detailed lineage across tens of millions of data sets.

      • Provide a metadata system of record for the enterprise with a catalog of catalogs.

    • Environment: Java, Spring, Hadoop, HBase, JanusGraph, Apache Solr

  3. Enterprise Data Lake (2015)

  4. Generic Connectivity SDK (2011 - 2014)

  5. Connectivity Solutions for MS Dynamics, Netezza, and Salesforce (2011 - 2014)

Teleonto Technologies Pvt. Ltd. (2006 - 2010)

  1. CApTOR (A Revenue Assurance Product from Teleonto)

    • CApTOR™ is an integrated Revenue Assurance product that can be used to proactively detect and prevent revenue leakages that occur at various touch points in the collection systems.

    • Environment: Lisp, C++, Java, ApT (An SOA framework in Lisp), Allegro Graph, Allegro Cache, PostgreSQL, GridSQL.

    • Key Features:

      • Appliance-based shared-nothing architecture to support massively parallel processing and easy scalability.

      • SOA-based layered architectural design.

  2. User Traffic Analyzer

    • An analytic tool developed for Revenue Assurance analysts to analyze calls (Call Detail Records) made by a telco's subscribers over a period of time. Calls are processed and stored in a graph database in an easily searchable/traversable manner.

    • Environment: Lisp, ApT, Allegro Graph.

  3. Flexible and Customizable Rating Engine

    • Led a team of four in developing a Rating Engine that can be easily customized for a telco. Used a distributed rating architecture in which a Rating Manager distributes the load across several Rating Servers.

    • Environment: Lisp, ApT (SOA framework), Lisa (Rule Engine).
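
    The manager/server split described above can be sketched as simple round-robin dispatch. This is a hypothetical Python illustration of the idea, not the original Lisp/ApT implementation; the class names, the per-second tariff, and the CDR fields are all assumptions.

    ```python
    from itertools import cycle

    class RatingServer:
        """Rates one CDR with a flat per-second tariff (an assumed tariff model)."""
        def __init__(self, name, rate_per_second):
            self.name = name
            self.rate_per_second = rate_per_second

        def rate(self, cdr):
            return {"id": cdr["id"],
                    "charge": cdr["duration"] * self.rate_per_second,
                    "rated_by": self.name}

    class RatingManager:
        """Distributes incoming CDRs across rating servers in round-robin order."""
        def __init__(self, servers):
            self._servers = cycle(servers)

        def rate_all(self, cdrs):
            return [next(self._servers).rate(cdr) for cdr in cdrs]

    servers = [RatingServer("rs1", 2), RatingServer("rs2", 2)]
    manager = RatingManager(servers)
    rated = manager.rate_all([{"id": 1, "duration": 60}, {"id": 2, "duration": 30}])
    print([r["rated_by"] for r in rated])  # ['rs1', 'rs2']
    ```

    A production rating manager would balance on server load rather than strict rotation, but round-robin captures the fan-out shape of the design.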

  4. CApTOR – Workflows

    • Workflows is the core functionality of CApTOR, based on a service-oriented architecture. It acts as intelligent middleware between the GUI and the core functional components (such as FTP Pull, Mediation, DB Loading, Reconciliation, and Rating) and distributes the work across multiple cores/CPUs.

    • Environment: Lisp, Java, ApT (An SOA framework in Lisp), JavaScript, Jakarta Tomcat

  5. Generic Distributed Reconciliation Software

    • Developed a generic distributed reconciliation engine that reconciles two CSV (comma-separated values) sources and produces a detailed classification report.

    • Environment: Lisp, C++, ApT (An SOA framework in Lisp), Allegro Cache (An object database)
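
    The core of such a reconciliation can be sketched as a key-based comparison that classifies records as matched, mismatched, or present on only one side. This is a minimal Python sketch of the general technique, not the original Lisp/C++ engine; the field names and classification labels are assumptions.

    ```python
    import csv
    import io

    def reconcile(source_a, source_b, key_field, value_field):
        """Compare two CSV sources keyed on key_field and classify each record."""
        a = {row[key_field]: row[value_field] for row in csv.DictReader(source_a)}
        b = {row[key_field]: row[value_field] for row in csv.DictReader(source_b)}
        report = {"matched": [], "mismatched": [], "only_in_a": [], "only_in_b": []}
        for key, val in a.items():
            if key not in b:
                report["only_in_a"].append(key)
            elif b[key] == val:
                report["matched"].append(key)
            else:
                report["mismatched"].append(key)
        report["only_in_b"] = [k for k in b if k not in a]
        return report

    # Example with two small in-memory CSV sources.
    src_a = io.StringIO("id,amount\n1,10\n2,20\n3,30\n")
    src_b = io.StringIO("id,amount\n1,10\n2,25\n4,40\n")
    print(reconcile(src_a, src_b, "id", "amount"))
    # {'matched': ['1'], 'mismatched': ['2'], 'only_in_a': ['3'], 'only_in_b': ['4']}
    ```

    The distributed variant partitions the key space so each worker reconciles an independent slice and the per-class lists are concatenated afterwards.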

  6. DB Layer & Multi-dimensional Data Summary

    • As a member of a very small team, developed a Database Layer for CApTOR (Telecom Revenue Assurance Product) using an object-oriented database (Allegro Cache). The layer loads the full daily data (150 million records) into Allegro Cache, builds a multi-dimensional summary, and provides a query interface to both the summary and the base data. One month of data (4.5 billion CDRs) can be loaded and queried.

    • Environment: Lisp, ApT (An SOA framework in Lisp), Allegro Cache
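
    A multi-dimensional summary of this kind amounts to aggregating a measure over tuples of dimension values, so later queries hit the small summary instead of rescanning the base records. The sketch below is a hypothetical Python illustration under assumed CDR fields, not the original Lisp/Allegro Cache code.

    ```python
    from collections import defaultdict

    def summarize(cdrs, dimensions, measure):
        """Aggregate count and total of `measure` per tuple of `dimensions`."""
        summary = defaultdict(lambda: {"count": 0, "total": 0.0})
        for cdr in cdrs:
            key = tuple(cdr[d] for d in dimensions)
            cell = summary[key]
            cell["count"] += 1
            cell["total"] += cdr[measure]
        return dict(summary)

    # Hypothetical CDRs with assumed fields.
    cdrs = [
        {"region": "AP", "call_type": "voice", "duration": 60},
        {"region": "AP", "call_type": "sms",   "duration": 0},
        {"region": "TN", "call_type": "voice", "duration": 120},
        {"region": "AP", "call_type": "voice", "duration": 30},
    ]
    summary = summarize(cdrs, ("region", "call_type"), "duration")
    print(summary[("AP", "voice")])  # {'count': 2, 'total': 90.0}
    ```

    Querying the summary is then a dictionary lookup per dimension combination; drill-down to base data reuses the same keys.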

  7. Roaming Analytics using Ontology

    • Analyzes the behavior of in-roamers on a telecom operator's network to provide quality of service and attract them. Modeled an ontology and developed a knowledge base using:

      • OWL (Semantic Web) and a graph database (Allegro Graph)

      • Frame based (or Class Based) modeling and an object database (Allegro Cache)

    • Developed the whole system from scratch and was able to show significant results within two weeks. Research for this project covered the Semantic Web, clustering techniques, and the use of OWL, Racer, and other related technologies.

Academic Projects (2004-2006)