  • Bluetooth Chipset Firmware

    Objectives

    • Chipset firmware to control Bluetooth radio operations
    Existing Challenges

    • Bugs and electrical inconsistencies in China-manufactured chipsets
    • Mass-production timeline windows
    Solutions

    • Custom host controller diagnostics code
    • Custom Bluetooth protocol stack firmware
    • FPGA integration testing
    Benefits

    • Stable Bluetooth device functionality
    • In-house codebase for future feature development
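
As a sketch of the host controller diagnostics approach: over a UART transport, an HCI command is framed as a packet-type byte, a 16-bit little-endian opcode built from OGF/OCF, a parameter length, and the parameters (per the Bluetooth Core Specification). A minimal Python illustration, with the serial transport deliberately omitted:

```python
import struct

def hci_command(ogf: int, ocf: int, params: bytes = b"") -> bytes:
    """Build a UART HCI command packet: 0x01 packet-type byte,
    16-bit little-endian opcode (OGF << 10 | OCF), length, params."""
    opcode = (ogf << 10) | ocf
    return struct.pack("<BHB", 0x01, opcode, len(params)) + params

# HCI_Reset: OGF 0x03 (Controller & Baseband), OCF 0x0003, no parameters.
# A diagnostics host would write this over the serial link to the chipset.
reset = hci_command(0x03, 0x0003)
```

Framing the packets separately from the transport keeps the diagnostics logic testable without radio hardware on the bench.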

  • Embedded Linux OS driver development

    Objectives

    • Generic drivers for bespoke chipsets on Linux-based embedded OSes
    Existing Challenges

    • Varied Debian- and Red Hat-derived Linux distros
    • Custom microcontroller pinset controls
    • Buggy chipset firmware
    Solutions

    • Cross-compiled builds from x86 hosts to embedded CPUs, e.g. ARM, PIC, Motorola
    • Custom firmware plugins in Assembly, C++, and C
    • Bespoke test harness rigs
    • FPGA integration with MATLAB and electrical networks
    Benefits

    • Stable device integration and functionality
    • In-house code for future feature and fix development
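
The custom pinset controls above typically reduce to writes against the kernel's GPIO interface. A minimal Python sketch using the legacy sysfs GPIO paths (deprecated in newer kernels in favor of the gpiod character device); the pin number is illustrative, and the write sequence is returned rather than executed so it can be exercised off-target:

```python
from pathlib import Path

GPIO_ROOT = Path("/sys/class/gpio")  # legacy sysfs GPIO interface

def export_commands(pin: int, direction: str = "out"):
    """Yield (path, value) writes that export a GPIO pin and set its
    direction; the caller performs the actual writes, so the sequence
    can be unit-tested on a development host without target hardware."""
    yield GPIO_ROOT / "export", str(pin)
    yield GPIO_ROOT / f"gpio{pin}" / "direction", direction

def set_pin(pin: int, value: bool):
    """Yield the (path, value) write that drives an exported pin."""
    yield GPIO_ROOT / f"gpio{pin}" / "value", "1" if value else "0"
```

On target, a thin executor loops over the yielded pairs and calls `path.write_text(value)`; off target, tests assert on the pairs directly.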

  • Device Diagnostics Harness

    Objectives

    • Device harness firmware for post-assembly-line device quality assurance testing and RMAs
    Existing Challenges

    • Buggy chipsets from Chinese manufacturing
    • No automated control mechanisms from the host
    • Custom FPGA design and integration details
    • Mixed chipset architectures, e.g. ARM, PIC
    Solutions

    • C++ bespoke firmware for harness controller
    • Assembly code for functional chipset interfaces, e.g. Wi-Fi, Bluetooth, HIDs, screen, USB I/O
    • Custom electrical network integration
    • Windows .NET host controller application and utilities
    Benefits

    • Dual-purpose harness for device testing during dev and mass production
    • In-house code for future feature and fix development
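
The harness firmware itself was C++/Assembly with a .NET host application; the host-side sequencing logic can be sketched in Python. The interface names and the device stub below are hypothetical:

```python
def run_diagnostics(device, tests):
    """Run each named self-test against a device and collect pass/fail
    results; a failing interface does not abort the run, so an RMA
    report captures every fault on the unit, not just the first."""
    results = {}
    for name, check in tests.items():
        try:
            results[name] = bool(check(device))
        except Exception:
            results[name] = False  # a crashing test counts as a failure
    return results

# Hypothetical device stub standing in for real chipset interfaces.
device = {"wifi": True, "bluetooth": True, "usb": False}
tests = {
    "wifi": lambda d: d["wifi"],
    "bluetooth": lambda d: d["bluetooth"],
    "usb": lambda d: d["usb"],
}
report = run_diagnostics(device, tests)
```

The same run-everything-and-report pattern serves both development bring-up and mass-production QA, which is what makes the harness dual-purpose.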

  • AWS Data Science Analysis and ML Pipeline Platform: Databricks Spark, Sagemaker

    Objectives

    • Cohesive environment for time-effective Data Science experiments on big data
    • Production-capable ML pipelining
    Existing Challenges

    • MLOps challenges, e.g. IAM policies, per-model-type resource configs, feature stores
    • Long experimentation cycle times
    • Data Scientists lack independence in data procurement and library setup
    • Lack of production-capable processing
    • Disparate data sets
    Solutions

    • Exploratory work and development of models are done via SageMaker Studio by Data Scientists
    • SageMaker Studio SSO and MFA integration and isolated S3 paths satisfy enterprise dev ops compliance
    • Databricks for big-data batch processing and S3 for training dataset storage
    • Productionized jobs via the SageMaker Python API, deployable through the existing CI/CD
    Benefits

    • Time-effective data science work
    • Productionized models maintainable by staff DE and Ops teams
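
As an illustration of productionizing via the SageMaker API: a job is described by a request body handed to the CreateTrainingJob call (boto3's `sagemaker` client), which slots into a traditional CI/CD pipeline as plain code. A hedged sketch; every name, ARN, and S3 path below is a placeholder:

```python
def training_job_config(job_name, role_arn, image_uri, s3_train, s3_output,
                        instance_type="ml.m5.xlarge"):
    """Build the request body for SageMaker's CreateTrainingJob API,
    invoked as sagemaker_client.create_training_job(**config).
    All identifiers passed in are deployment-specific placeholders."""
    return {
        "TrainingJobName": job_name,
        "RoleArn": role_arn,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,
            "TrainingInputMode": "File",
        },
        "InputDataConfig": [{
            "ChannelName": "train",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": s3_train,
                "S3DataDistributionType": "FullyReplicated",
            }},
        }],
        "OutputDataConfig": {"S3OutputPath": s3_output},
        "ResourceConfig": {
            "InstanceType": instance_type,
            "InstanceCount": 1,
            "VolumeSizeInGB": 50,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }
```

Because the job is just a reviewable dict in version control, staff DE and Ops teams can maintain it without Data Science involvement.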

  • Big Data and Financials Analysis Platform

    Objectives

    • AWS hosted platform focused on the storage, processing, and presentation of customer and product data
    • Spark Vendor Databricks implementation and workflow
    Existing Challenges

    • DB Storage costs
    • Long Pipeline runtimes
    • Multiple biz unit data accessibility and separation
    • PII, SOX, etc. regulatory compliance
    • Public enterprise IT, infosec compliance
    Solutions

    • Databricks for Spark big data processing, Jupyter Notebooks analysis, Hive tables on S3 for SQL
    • S3 for data-lake source-of-truth storage, plus utility staging and processing storage
    • Redshift for Data Warehouse availability to different business unit dashboarding and reports
    • ECS Containers for bespoke native codebase applications
    • Terraform IaC
    • Jenkins for code releases and environment separation
    Benefits

    • Analyst and DE accessible
    • Scalable
    • Enterprise compliant
    • Cost manageable
    • Flexible
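
The Hive-tables-on-S3 piece can be sketched as DDL generation: an external table pointing SQL users at data-lake files. Table and column names below are placeholders:

```python
def external_table_ddl(table, columns, s3_location, fmt="PARQUET"):
    """Render a Hive/Spark-SQL CREATE EXTERNAL TABLE statement for an
    S3-backed table -- the pattern that exposes lake data to SQL users
    without copying it into a database."""
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {table} (\n  {cols}\n)\n"
        f"STORED AS {fmt}\n"
        f"LOCATION '{s3_location}'"
    )

ddl = external_table_ddl(
    "lake.customers",
    [("customer_id", "BIGINT"), ("region", "STRING")],
    "s3://example-lake/customers/",
)
```

Keeping the data in S3 and only the metadata in the metastore is what keeps DB storage costs manageable while staying analyst-accessible.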

  • DE Staff Re-org Strategy and Workflow

    Objectives

    • New DE and Analytics department workflow
    • Re-structure Stakeholder Analytics requests workflow and implementation
    • Re-structure Data Modeling and Development for reporting workflow and implementation
    Existing Challenges

    • Balancing new biz objectives and priorities with existing workloads and operations
    • Inter-business-unit technical bureaucracies and politics
    Solutions

    • Evaluated existing ETL operations against new initiatives, e.g. medallion (bronze, silver, gold) executive-discoverable datasets
    • Formed dev squads, funneled stakeholder requests through analysts with PMO oversight
    • Jira Agile sprints to properly estimate and triage the work
    Benefits

    • Visibility of all operations across BI units
    • Accountability of resources across BI units
    • Efficient priority management relative to available resources

  • AWS Matillion, SQS, Lambda, Redshift, CDK

    Objectives

    • Low-code rapid prototype ETL ecosystem for pilot data analyst effort
    Existing Challenges

    • Existing dev resources are tied up
    • Access and ops bureaucracy takes time
    Solutions

    • Service and asset setup via CDK IaC, plus IAM configuration and workflow
    • Data-Mart solution for self-service isolated dataset BI work
    Benefits

    • Enclosed ETL ecosystem analysts can track and use
    • Big data capable
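
A minimal sketch of the SQS-to-Lambda leg of this ecosystem: the handler parses SQS records into rows for a downstream Redshift load. The payload fields are placeholders, and a real function would hand the rows to a COPY/INSERT stage rather than return them:

```python
import json

def handler(event, context=None):
    """AWS Lambda entry point for an SQS-triggered loader: parse each
    record body into a row tuple ready for a Redshift staging step.
    The (id, amount) payload shape is purely illustrative."""
    rows = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        rows.append((body["id"], body["amount"]))
    return {"rows": rows, "count": len(rows)}

# Shape of an SQS event as Lambda delivers it to the handler.
sample_event = {"Records": [{"body": json.dumps({"id": 1, "amount": 9.5})}]}
result = handler(sample_event)
```

Because the whole path is queue-in, rows-out, analysts can trace a record end-to-end inside the enclosed ecosystem.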

  • AWS Media Billing Platform

    Objectives

    • Programmatic Ad Sales delivery and billing reporting codebase
    • Fix and Refactor legacy codebase for speed, bug fixes and feature enhancements
    Existing Challenges

    • Debugging BI logic on big data required live Python pandas DataFrame debugging, i.e. analysts couldn’t view and debug reporting issues without a developer
    • Pipeline execution time is too long
    Solutions

    • GitHub Actions/Terraform CI/CD, EKS containers, Python pandas ETL to/from incremental S3 Glue DB tables with Airflow orchestration
    • Reworked the business dataset analysis workflow from Python pandas DataFrames to Glue/Athena tables
    Benefits

    • Analysts can now work directly with the data via SQL in Glue DB tables; once a fix is found, the logic is integrated by a developer via the normal sprint workflow
    • Pipelines are reliable and fast
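
The Glue/Athena rework can be illustrated as incremental partition loads expressed in SQL that analysts can read and run directly, in place of opaque in-memory DataFrame logic. Table and partition-column names below are placeholders:

```python
def incremental_insert_sql(target, source, ds):
    """Render the Athena SQL for one incremental daily load: insert only
    the source rows for partition date `ds` into the target Glue table,
    the pattern that replaces full pandas DataFrame rebuilds."""
    return (
        f"INSERT INTO {target}\n"
        f"SELECT * FROM {source}\n"
        f"WHERE ds = DATE '{ds}'"
    )

sql = incremental_insert_sql(
    "billing.delivery_daily", "billing.delivery_raw", "2024-01-01"
)
```

Airflow parameterizes the date per run, so each task touches one partition, which is what makes the pipelines both fast and re-runnable.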

  • AWS Big Data Media Platform

    Objectives

    • Replace the Databricks vendor platform and Redshift-centric ecosystem with a native AWS EMR / Jupyter Notebooks, Hive-centric, Redshift-data-mart ecosystem
    • Improve data analysis access, plus ETL speed, reliability, and cost effectiveness
    • Medallion (bronze, silver, gold) data triage for engagement, ad-sales and content data domains
    Existing Challenges

    • Disparate Data Access: Can’t easily gain access and query data across domains
    • Database issues: slow queries, queries that lock up or time out, DB load competition, usage limits
    Solutions

    • AWS EMR with dbt Spark ETL; EMR Jupyter Notebooks for analysis; GitHub Actions with Terraform CI/CD
    • S3 Glue DBs for lake storage; Redshift data-mart reporting storage
    Benefits

    • Long-term Flexibility, Reliability, Robustness
    • Cost Manageable and Flexible
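
For the EMR side, cluster provisioning reduces to a request body for the RunJobFlow API (boto3's `emr` client), which CI/CD or Terraform-driven tooling can submit. A sketch; the names, instance sizes, and release label are illustrative:

```python
def emr_cluster_spec(name, log_uri, subnet_id):
    """Build the request body for EMR's RunJobFlow API, invoked as
    emr_client.run_job_flow(**spec), for a transient Spark/Hive cluster.
    Sizes and the release label are placeholder choices."""
    return {
        "Name": name,
        "ReleaseLabel": "emr-6.15.0",
        "Applications": [{"Name": "Spark"}, {"Name": "Hive"}],
        "LogUri": log_uri,
        "Instances": {
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge",
                 "InstanceCount": 1},
                {"InstanceRole": "CORE", "InstanceType": "m5.xlarge",
                 "InstanceCount": 2},
            ],
            "Ec2SubnetId": subnet_id,
            # Transient cluster: terminate when the step queue drains.
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        "JobFlowRole": "EMR_EC2_DefaultRole",
        "ServiceRole": "EMR_DefaultRole",
    }
```

Transient, right-sized clusters per workload are a large part of the cost flexibility relative to an always-on vendor platform.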
