

Pooja Murarka


Previous positions

  • Lead Engineer at Opera Solutions
  • Senior Software Engineer at Opera Solutions


Inderprastha Engineering College, Bachelor of Technology (B.Tech.), Information Technology




Experienced Software Engineer with 9+ years of hands-on experience. Working as a key developer on the "Signal Hub" Big Data Platform team with deep knowledge of big data system development. Possesses strong skills in analyzing and implementing business and project processes through data pipelining. Expertise in developing analytic solutions on the Hadoop framework using big data tools for major clients in the airline, asset management, and pharmacy domains.

Software Skills:

  • Expertise in Java, Python, and Scala; working as a key developer on the Big Data Platform team.
  • Deep knowledge of Spark and the Hadoop framework for developing and deploying big data solutions effectively.
  • Expert in developing data-pipelining solutions, resolving data-skewness issues, and troubleshooting and debugging the Hadoop cluster manager and various errors and bugs, which reduces overall development time to meet critical deadlines.
  • Developed and implemented various algorithms and transformations in different data engines.
  • Experience in solution design, data analytics, data-flow design, data modeling, data mapping, business intelligence, ETL development, and profiling of scalable applications.
  • Extensive knowledge of data profiling, metadata-driven data-integrity testing, and developing workflows performing ETL on data received from multiple source systems (formats: XML, delimited, fixed-width, Avro, S3, Parquet).
  • Experienced in making the key decisions involved in converting a logical entity to a physical entity with maximum parallelism, which is necessary to deliver the fastest solution.
  • Experience managing and mentoring a team of 7+ people, including work assignment, code review, governance, and goal setting.
  • Extensive knowledge of various NoSQL data stores and the advantages and disadvantages of each, which is necessary to make the right architectural and design decisions.


  • Software Development Engineer II


    March 2019 – Present (7 months), San Diego

  • Lead Engineer

    Opera Solutions

    February 2016 – Present (3 years 8 months), San Diego

    Key developer of the "Signal Hub" platform, a big data analytics and signal-processing framework implemented with the following technologies: Spark, Hadoop, Java, Scala, Python, databases, Maven, Git/SVN. Working on transforming the platform to run on Spark. Redesigned the messaging framework. Converted DataMaps to templates using Velocity, which considerably reduced running time. Wrote scripts for automation and for generating Hive and Impala DDL corresponding to data residing on HDFS.
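    The DDL-generation step described above can be sketched as a small script that emits a Hive external-table statement for delimited data already sitting on HDFS. This is a minimal illustration, not the actual Signal Hub tooling; the table name, column list, and HDFS path below are hypothetical placeholders.

    ```python
    # Sketch: generate a Hive CREATE EXTERNAL TABLE statement for data on HDFS.
    # Table name, columns, and path are illustrative, not from the real platform.

    def hive_ddl(table, columns, hdfs_path, delimiter=","):
        """Build a Hive DDL string for a delimited file already on HDFS."""
        cols = ",\n  ".join(f"`{name}` {htype}" for name, htype in columns)
        return (
            f"CREATE EXTERNAL TABLE IF NOT EXISTS `{table}` (\n  {cols}\n)\n"
            f"ROW FORMAT DELIMITED FIELDS TERMINATED BY '{delimiter}'\n"
            f"STORED AS TEXTFILE\n"
            f"LOCATION '{hdfs_path}';"
        )

    ddl = hive_ddl(
        "signals",
        [("id", "BIGINT"), ("name", "STRING"), ("score", "DOUBLE")],
        "/data/signals/",
    )
    print(ddl)
    ```

    Because the table is EXTERNAL and only points at an HDFS location, the same generated statement (with Impala-compatible types) can register the data in Impala as well, without moving or copying files.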

  • Lead Engineer

    Opera Solutions

    July 2015 – February 2016 (7 months), Noida Area, India

  • Senior Software Engineer

    Opera Solutions

    October 2014 – October 2015 (1 year), Noida Area, India

    Worked as a Java developer on the Big Data Platform, adding new features and improvements to the platform (Signal Hub). Developed analytic solutions involving data-flow design along with cluster (AWS, DataRush) setup, solution deployment, monitoring, and performance optimization. Also developed data flows implementing the end-to-end process, from ETL to database loading; that data can then feed a web service or UI. (Client: United Airlines) Fetched a large volume of stock ticker data from Google & Yahoo for over 850 tickers, including news articles about them. Stored this data in HDFS, wrote a MapReduce program using the Hadoop Streaming API to count the positive and negative words in the news articles and correlate them with ticker-price patterns, and deployed it on AWS EMR. Also implemented a Hive query platform on top for querying the ticker-price data. Technologies: Java, Python, Linux, Hadoop, ETL
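    The mapper side of a positive/negative word count like the one described above might look like the following minimal Hadoop Streaming sketch. The word lists and the input layout (one "ticker<TAB>article text" record per line) are assumptions for illustration, not the original implementation.

    ```python
    # Sketch of a Hadoop Streaming mapper: count positive/negative words
    # per ticker in news text. Word lists and input format are assumed.
    import sys

    POSITIVE = {"gain", "growth", "profit", "beat", "strong"}
    NEGATIVE = {"loss", "drop", "miss", "weak", "decline"}

    def score_line(line):
        """Return (ticker, positive_count, negative_count) for one record."""
        ticker, _, text = line.partition("\t")
        words = text.lower().split()
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return ticker, pos, neg

    def run_mapper(lines, out=sys.stdout):
        # Streaming output: key<TAB>values, which the reducer aggregates
        # per ticker before correlating with price movements.
        for line in lines:
            ticker, pos, neg = score_line(line.rstrip("\n"))
            out.write(f"{ticker}\t{pos}\t{neg}\n")

    demo = score_line("UAL\tstrong profit growth despite weak demand")
    print(demo)  # → ('UAL', 3, 1)
    ```

    In a real job, `run_mapper(sys.stdin)` would be invoked by the Streaming framework, with a companion reducer summing the counts per ticker key.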

  • Software Engineer

    Opera Solutions

    May 2013 – September 2014 (1 year 4 months), Noida Area, India

    Worked as an ETL developer automating flows for Investment Asset Management, a professional management platform for traditional (stocks, bonds, cash, real estate) and alternative (commodities, hedge funds, private equity, venture capital) assets. Set up and deployed it on an AWS cluster (12 nodes) to enhance performance (daily data size ~20 GB). Developed a workflow using Bonita (a BPM tool). Played a key role in managing the backend services layer, evaluated different BI tools (Pentaho, LogiXML) and databases (Vectorwise, InfiniDB), and worked closely with the business team to understand requirements and develop backend expertise across various spend categories.

  • Data Developer

    Opera Solutions

    February 2011 – April 2013 (2 years 2 months), Noida Area, India

    Designed the VSA workflow solution for the Procurement group of Ameriprise Financial. Procurement serves many functions at Ameriprise, including evaluating vendor risks (e.g., information security, corporate (physical) security, business continuity, financial health, and legal risks) for vendors that provide services to Ameriprise. At present, VSA is largely orchestrated and executed through disjointed, manual processes. Evaluated various BI tools such as Yellowfin BI, LogiXML, Pentaho, Elixier, and Tableau. Used Talend (an ETL tool) to load data into staging and then into the InfiniDB database.

  • Product Engineer

    Symphony Infospace Pvt Ltd

    December 2009 – February 2011 (1 year 2 months), Bangalore

    Worked as a developer in Java, J2EE, Hibernate, and Spring. The work involved the following: requirement gathering, analysis, and noting key observations for implementation; designing, coding, and deployment using Java, J2EE, Hibernate, and Spring; and playing a key role in unit testing, integration testing, and bug fixing.


  • Inderprastha Engineering College

    Bachelor of Technology (B.Tech.), Information Technology

    2005 – 2009

    • Won an Excellence Award for securing 1st position in academics in the IT branch and in the IT branch aggregate. • Developed Desk Jockey Tool, a web UI application centered on employees (payroll processing, leave management, report generation, effort calculation). • Developed Seating Arrangement, a web UI application implementing a seating plan used in examination systems; it involved designing, coding, and implementation per the specific requirements of UPTU University.

    Activities and Societies

    Attended a session on "Dream Spark" by Bill Gates at IIT Delhi, Nov 2008. Participated in the "Build Your Personal Brand" workshop by Tech Tribe and Jaypee Business School, March 2007.

  • Greenway Modern Senior Secondary school

    Higher Secondary, Physics, Chemistry, Mathematics and Computer Science

    2003 – 2005

Skills & Expertise

  • ETL Tools (Talend, DataRush)
  • Databases (MySQL, Vectorwise, Microsoft SQL Server)
  • Linux
  • Big Data Analytics
  • Python
  • Big Data
  • Workflow Engine (Bonita)
  • Java
  • Hadoop
  • Business Intelligence Tools (Pentaho, Tableau, LogiXML)


  • Data Structures and Performance

    Coursera, License PPBDLQ5MM4CG

    May 2017