BigQuery Jobs
Daily update of various source-data CSVs, with archiving and querying, to support individual queries and display of historical portfolio developments.
...and optimization of our Google Cloud environment. The app is implemented as a data pipeline that receives data over an HTTP interface, pushes it through several ETL processes, and finally stores the results in BigQuery. The project consists of two parts: - reproducing the existing infrastructure scripts with Terraform - optimizing the existing infrastructure (Dataflow, VPCs, monitoring, alerting) It is important to us that you have very good knowledge in the following areas: - Terraform - Dataflow - PubSub - Compute Engine - BigQuery...
...land everything in BigQuery with the same—or better—performance, data quality, and monitoring we have today. Here’s what I need from you: • Assess the current SSIS packages, map out every source table and file feed, and document the equivalent flow on GCP. • Design and build the new pipelines using native services (Dataflow, Cloud Data Fusion, Cloud Composer, or any combination you feel is best) with full error handling, logging, and retry logic. • Automate scheduling and environment-specific configuration so deployments to dev, test, and prod are effortless—Terraform or Deployment Manager scripts are welcome. • Provide hand-over documentation plus a short walkthrough so my team can operate and extend the solution. You should ...
...performance monitoring—so shoppers are consistently routed to the correct territory. What I’m looking for You should have hands-on experience with large ecommerce platforms (ideally in the automotive or parts sector), a track record of solving crawl/index challenges at scale, and advanced knowledge of technical SEO tools such as Screaming Frog, Log File Analyzer, Google Search Console, and BigQuery or similar for large data sets. To be considered, please share a brief introduction along with two or three ecommerce SEO projects you’ve led that involved millions of URLs or similarly complex site architectures. Highlight the specific wins you delivered—whether it was index cleanup, rich-result gains, or international rollouts. I’m ready to move q...
I’m looking for an experienced hand to take my existing Google Tag Manager container and q...scroll-depth, outbound click, and any other essential GA4 events you recommend). • Creating and analyzing reports – once the data is flowing into Google Analytics 4, I want meaningful Looker Studio dashboards that surface traffic sources, conversions, and user journeys at a glance. • Data harvesting, storage, and reporting – advice or light configuration around routing the GA4 stream into BigQuery (or a comparable warehouse) so historical analysis is effortless. Everything must be configured through Google Tag Manager and tested so I can confidently roll it into production without extra debugging. If you’ve done similar projects and can deliver with...
I plan to run a practical-heavy Google Cloud Platform course that guides an intermediate audience through the core data-engineering services—BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer. The group already understands basic GCP navigation; what they need now is the industrial “how it’s really done” perspective: designing pipelines, optimising cost, securing data, and automating workflows at scale. The programme should balance theory with hands-on work. I expect each concept to be introduced with a concise architectural walkthrough and then reinforced by a lab where learners build or tweak a working pipeline on their own accounts. Please weave in best-practice checkpoints, common production pitfalls, and tips on monitoring with Cloud Logg...
...optimization Partitioning & indexing strategies Query acceleration techniques Caching layers Parallel query execution Horizontal scaling Suggested Tech Stack (Open to Better Suggestions) ========================================== Backend Python (FastAPI / Django) or Node.js Go (optional for performance-critical services) Database Options ClickHouse (preferred) Apache Druid Elasticsearch BigQuery / Redshift Any distributed columnar DB Frontend React / Advanced filtering UI Data grid with pagination & lazy loading Infrastructure Kubernetes / Docker Load balancing CDN Caching (Redis) Object storage for raw data Required Features ================ Advanced Search Engine Full-text search Multi-filter query builder Auto-suggestions Fuzzy matching A...
...the bootcamp is hands-on practice: together we will build, deploy, monitor and troubleshoot real ETL flows until I can run them solo with confidence. Primary learning goal My priority is data ingestion, extraction, transformation and storage. We will start with the full ingestion toolkit—Dataflow, Pub/Sub and Cloud Storage—then chain that work into the wider GCP ecosystem: • Storage layers: BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore / Firestore • Transformation & orchestration: Data Fusion, Dataproc, Cloud Composer, Cloud Scheduler, Cloud Functions • Data quality & cataloging: Dataprep, Data Catalog • Visualisation: Looker, Data Studio Preferred teaching style Live, screen-shared sessions where you demo, then watc...
...optimize data models for analytics and reporting Work with cloud data platforms (AWS, Azure, or GCP) Implement data validation, quality checks, and monitoring Build dashboards and actionable insights using BI tools Collaborate on performance tuning and data architecture improvements - Required Skills Strong experience with SQL and Python Expertise in modern data stack tools (e.g., Snowflake, BigQuery, Redshift, Databricks) Hands-on experience with orchestration tools (Airflow or similar) Experience with dbt or data transformation frameworks Knowledge of cloud platforms (AWS/Azure/GCP) Strong analytical thinking and problem-solving skills - Nice to Have Experience with real-time data processing (Kafka, streaming frameworks) Experience with machine learning pipelines ...
We are looking for experienced Data Analysts, Data Engineers, or BI Analysts to test and complete an onboarding flow for our new data platform. The goal is to generate a functional "context layer" using your own live data environment. The Task 1. Onboard: Connect your own Data Warehouse (Snowflake, BigQuery, Redshift, etc.). 2. Model: Go through a short modeling flow to create a context layer for your tables. 3. Stress-Test: Ask 30 natural language questions based on your data (e.g., "What was the YoY growth for [Product X] in Q3?"). 4. Validate: Provide feedback on whether the platform answered correctly. If it failed, you will identify why (e.g., wrong join, misunderstood metric, or schema ambiguity). Requirements (Strict) - Active DWH Access: You must have a...
Three main tasks are required: 1) complete the Facebook Ads connector so all the data syncs into BQ; 2) set up the TrustPilot API to sync all reviews into a new dataset; 3) set up a Facebook Page sync so all the page data, comments, etc. flows into a new dataset.
...record stores instantly. • Reporting & analytics: dashboards, charts, and exportable summaries that show SKU totals, variances, and historical counts at a glance. • Dual updates: staff adjust counts manually inside the app while scheduled automations pull fresh purchase/sales data from linked sheets or a simple REST/webhook to keep figures in sync. What I’ll hand you A clean Google Sheet (or BigQuery table if you prefer) containing sample SKUs, barcodes, and locations, plus access to a Workspace account for testing. What I need back • A scalable data schema and security rules. • A mobile-friendly UI with the AppSheet barcode component. • Automated workflows that post updates both ways. • Custom views and PDF/CSV report templates. • B...
...input, external databases (MySQL/PostgreSQL), and several REST-based APIs. Connection logic, authentication handling, and error catching must be built in. – Everything should run from a daily time-based trigger; I only want to intervene when a warning log flags an issue. – The solution must stay inside the Google Workspace stack (Apps Script, connected sheets, possible use of Apps Sheet or BigQuery is fine if it helps) so I avoid third-party subscription fees. – Clean, commented code and a short “how-to” doc explaining the trigger setup, variable sections, and where to extend data sources later. Acceptance criteria 1. One Google Sheet containing: raw data tabs, calculation tabs, and a read-only dashboard tab. 2. Apps Script bound to the fi...
...specifically, I need insight on: • Page views and navigation paths — where users enter, where they drop, and the common journeys in between • Interactions with elements (clicks, scrolls) — which buttons, links, or sections grab attention and which are ignored Raw data can be delivered in GA4, server logs, or CSV form; you pick the environment you feel most comfortable in (Python, R, SQL, BigQuery, or a BI tool such as Tableau, Power BI, or Looker Studio). I expect you to handle cleaning and modeling, run the exploratory analysis, and wrap up with a concise report or dashboard that highlights the key findings along with action-oriented recommendations. Acceptance criteria • Reproducible code/notebook or documented query set • Visualizati...
...Google marketing stack into Google BigQuery as a centralized data warehouse, with an AI-powered natural language query interface on top. ## Scope of Work **Phase 1: Data Integration to BigQuery** Connect and automate data pipelines from: - Google Analytics 4 (GA4) - Google Ads - Google Search Console - Google Tag Manager - Google Merchant Center Requirements: - Set up automated, near real-time data transfers - Design efficient BigQuery schema with proper data modeling - Implement ETL/ELT processes with data quality checks - Create unified views combining data across platforms - Import historical data - Document all data flows and transformation logic **Phase 2: AI Integration Layer** Implement AI-powered interface for natural language querying of BigQuery...
... Non-Negotiables 1. UEI is the identity anchor. 2. Do not dedupe by name. 3. If rules conflict or data is ambiguous: stop and flag before proceeding. 4. Follow SOP exactly. Working Style We want someone who: >thinks in SQL terms (joins, GROUP BY, keys, aggregation) >communicates clearly when data conflicts exist >prefers correctness over speed Tools You can use: a) SQL (Postgres/MySQL/BigQuery style acceptable) b) Or SQL + Excel/Sheets only for validation Output must be CSV. Budget Fixed USD 80 total, paid as 3 milestones: Milestone 1: Stage/load + USAspending L24M filter (deliver staging SQL + intermediate output) Milestone 2: Canonical vendor aggregation + Segment B/C logic + QA tie-outs Milestone 3: DSBS + SAM UEI joins + dial_phone + final enriched CSV Ho...
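The "UEI is the identity anchor / do not dedupe by name" rule above can be sketched in SQL terms. This is a minimal illustration using SQLite with invented sample rows (table, column names, and values are assumptions, not the actual SOP): two rows share a UEI but spell the vendor name differently, so grouping by UEI keeps them as one vendor, whereas grouping by name would wrongly split them.

```python
import sqlite3

# Tiny demo of UEI-anchored aggregation. Data is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE awards(uei TEXT, vendor_name TEXT, amount REAL);
INSERT INTO awards VALUES
  ('ABC123', 'Acme Corp',  100.0),   -- same UEI, name variant 1
  ('ABC123', 'ACME CORP.',  50.0),   -- same UEI, name variant 2
  ('XYZ789', 'Acme Corp',   25.0);   -- same name, DIFFERENT UEI
""")
# Group strictly by UEI; pick one spelling as the canonical display name.
rows = conn.execute("""
    SELECT uei, MIN(vendor_name) AS canonical_name, SUM(amount) AS total
    FROM awards GROUP BY uei ORDER BY uei
""").fetchall()
print(rows)  # [('ABC123', 'ACME CORP.', 150.0), ('XYZ789', 'Acme Corp', 25.0)]
```

Note that grouping by `vendor_name` would have merged the two distinct UEIs sharing the name "Acme Corp", which is exactly the failure mode the non-negotiables forbid.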
...Test: Connect the Airtable template to your local Postgres or MySQL instance. ● Scenario B: Local Analytics (Airtable → DuckDB) ○ Flow: DLT-Code Node (Source) → DuckDB Node (Destination). ○ Test: Materialize Airtable data into a local .duckdb file. ● Scenario C: Universal Connector (Airtable → DLT-Code Destination) ○ Flow: DLT-Code Node (Source) → DLT-Code Node (Destination Writer). ○ Test 1 (BigQuery): Credentials will be provided. ○ Test 2 (Databricks): You must use a free Databricks Community Edition account. 3. Requirements ● Accounts: The candidate is responsible for setting up their own free Airtable and Databricks Community Edition accounts for testing. 4. Deliverables ● The Template: Finalized DLT Python template code featuring the Airtable placehold...
...matching (hashed IDs, tokenization, etc.) o Controlled analytics and outputs only (no raw data sharing) 2. Evaluate Build vs Buy o Compare building from scratch vs existing solutions o Pros/cons, cost, scalability, vendor lock-in o Examples of real-world implementations 3. Technology recommendations o Cloud or hybrid architecture o Tools for privacy, access control, encryption o Possible use of BigQuery, Snowflake, AWS Clean Rooms, or custom stack 4. Security & Governance o Data access rules and roles o Auditability and traceability o Compliance-oriented design (financial / credit data context) 5. Deliverables o High-level architecture diagram o Recommended tech stack o Decision framework: build vs buy o Optional: roadmap for implementation Nice to have (big plus) • Expe...
I need a Google Cloud / BigQuery specialist to stand up an end-to-end, webhook-driven data ingestion pipeline running in our production environment. When our external form system fires a webhook, your Cloud Function (or equivalent service) should capture the JSON payload, write the untouched record to a raw BigQuery table, then immediately process it. The processing step must • parse any nested JSON, • flatten and clean each answer field, • split the results into two purpose-built reporting tables, and • guarantee idempotency through a hashing technique that blocks duplicates. All components have to be secure, version-controlled, and able to scale with traffic spikes. This is strictly a backend/data-engineering job—no UI work is involved.
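The flatten-and-hash steps described above can be sketched in pure Python. Field names and the payload shape are invented for illustration; the actual writes would go through the `google-cloud-bigquery` client, with the hash compared against the raw table (e.g. via a `MERGE` on the hash column) to guarantee idempotency.

```python
import hashlib
import json

def flatten(record, prefix=""):
    """Recursively flatten nested JSON into dot-separated column names."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

def row_hash(record):
    """Deterministic hash of the payload, used as an idempotency key."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical payload as a form system might send it.
payload = {"form_id": "f-1", "answers": {"q1": "yes", "q2": {"score": 7}}}
flat = flatten(payload)
# flat == {"form_id": "f-1", "answers.q1": "yes", "answers.q2.score": 7}
```

Because the hash is computed over the canonical (key-sorted) JSON, re-delivered webhooks produce the same key and can be skipped, which is what makes the pipeline safe to retry.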
...output should flow straight into a Google Cloud environment (BigQuery is ideal, but Cloud Storage plus Cloud Functions works too). After the upload, I want an exploratory analysis that highlights patterns, trends and any obvious outliers that deserve attention. Deliverables • Well-commented scripts or workflows that automate duplicate removal, spell-checking and format standardisation • A repeatable pipeline that loads the cleansed data into Google Cloud and can be triggered on demand • At least one concise insight report or dashboard with visualisations and written commentary • Clear documentation so I can rerun or extend the process without guesswork Python, SQL and native Google Cloud tools such as Dataflow, BigQuery, or Looker Studio are...
...We're a medium-sized ecommerce retailer running NetSuite for our ERP and BigCommerce for our front-end website. We're looking for a data engineer to help build a data warehouse for NetSuite, BigCommerce, and potentially some other data platforms to help us analyse our data faster and more accurately. We wish to engage a third-party data engineer to build a central data warehouse in Google BigQuery (or another similar service). We want to analyse data in a data visualisation tool such as Power BI. We are also interested in other capabilities we could use the data warehouse for in the future (better exception reporting, better customer analytics, new sales & marketing opportunities, etc.) We have a capable internal IT/Analytics team, however the team...
...using Dataflow, BigQuery, and Cloud Storage to feed data into search indices. - Fine-tune ranking, indexing, and relevance using features like Vector Search (Matching Engine), embeddings, and semantic understanding. - Design "agentic" AI systems that can perform complex, multi-step tasks across data stores. - Configure search features like autocomplete, facets, and recommendations for our e-commerce and social media. Required Technical Skills: - Proficiency in Google Cloud Platform (GCP) and the Vertex AI suite (Search & Conversation, Vertex AI Studio, and Model Garden). - Mastery of customizing search functionality via APIs and SDKs. - Strong understanding of LLMs (Gemini), prompt engineering, tokenization, and vector-based information retrieval. - Experience wit...
I write SQL daily and feel solid with the basics, but when my event-stream tables collide in complex BigQuery joins I keep hitting errors—ambiguous fields, exploding row counts, and the occasional “resources exceeded” message. I want to sit down with someone who has already wrestled with BigQuery’s quirks and can show me, step by step, how to diagnose and fix these join issues on time-series data. Here’s what I’m hoping for: • A live screenshare or pair-programming session (1–2 hrs) where we walk through my current queries and pinpoint why the joins misbehave. • Clear explanations of BigQuery internals—how sharding, partition pruning, and JOIN-ordering affect performance and correctness. • Concrete take-away...
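The "exploding row counts" symptom described here usually comes from joining on a key that is not unique on one side. A tiny in-memory SQLite demo (table and column names invented) shows the fan-out and the standard fix of deduplicating the many-side first; in BigQuery the dedup step would typically use `ROW_NUMBER() OVER (PARTITION BY ...)` instead of `MAX`.

```python
import sqlite3

# When the right side has duplicate keys, each left row is multiplied by
# the number of matches -- the classic join fan-out.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events(user_id INT, ts INT);
CREATE TABLE profiles(user_id INT, plan TEXT);
INSERT INTO events VALUES (1, 100), (1, 200);
INSERT INTO profiles VALUES (1, 'free'), (1, 'pro');  -- duplicate key!
""")

naive = conn.execute(
    "SELECT COUNT(*) FROM events e JOIN profiles p USING(user_id)"
).fetchone()[0]
# 2 events x 2 profile rows = 4 rows, not the 2 you expected.

deduped = conn.execute("""
    SELECT COUNT(*) FROM events e
    JOIN (SELECT user_id, MAX(plan) AS plan
          FROM profiles GROUP BY user_id) p USING(user_id)
""").fetchone()[0]
# Deduplicating the right side first restores one row per event.
print(naive, deduped)  # 4 2
```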
...metrics from Compute Engine, Cloud Storage and BigQuery. The focus is to track how close we are to exhausting available IP addresses and make that instantly obvious through chart-based visuals. The charts must change colour (or otherwise call attention) at 70 %, 80 %, 90 % and 100 % so no one has to guess when capacity is getting tight. Each of those four thresholds should also fire an email alert so the right people can act before things break. If you believe gauges or heatmaps would add clarity I’m happy to discuss, but my primary preference is clean, at-a-glance charts. Deliverables • A Dynatrace dashboard pulling live data from GCP that displays IP address utilisation plus key metrics for Compute Engine, Cloud Storage and BigQuery • Four th...
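The four-band threshold logic described above would live in Dynatrace's own threshold rules, but as a sketch of the intended behaviour (function and band names are assumptions for illustration):

```python
# Map IP-pool utilisation to the requested alert bands at 70/80/90/100 %.
THRESHOLDS = [(100, "critical"), (90, "high"), (80, "elevated"), (70, "warning")]

def alert_band(used: int, capacity: int) -> str:
    """Return the alert band for a given utilisation, or 'ok' below 70 %."""
    pct = 100 * used / capacity
    for limit, band in THRESHOLDS:
        if pct >= limit:
            return band
    return "ok"

print(alert_band(7, 10))    # warning  (exactly 70 %)
print(alert_band(95, 100))  # high
```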
...human-resource information—up to date and error-free. The focus is on resource tracking: I need to see who is available, who is booked, and how utilisation is trending without having to touch a spreadsheet. Here’s the picture: • Source data lives in Excel/CSV files (and, if you can integrate it, a small HRIS API). • I want it validated, normalised, and stored in a structured repository—SQL, BigQuery, even Google Sheets if that’s the simplest. • A refreshed report or dashboard should surface the key tracking metrics for managers automatically, on a schedule we set together. You’re free to choose the tooling—Python, Google Apps Script, Power Automate, Zapier, or something comparable—so long as the workflow is maintaina...
I'm seeking an experienced SQL developer to optimize queries in BigQuery, focusing on complex joins and aggregations. Key requirements: - Expertise in BigQuery and SQL - Proven experience in query optimization - Strong problem-solving skills - Ability to work within budget constraints Ideal candidates should have a track record of improving query performance and be able to demonstrate their expertise with relevant certifications or project experience.
...probabilistic modeling, statistical inference, and experimentation frameworks (A/B testing, causal inference). Can collect, clean, and transform complex datasets into structured formats ready for modeling and analysis. Have experience designing and evaluating predictive models, using metrics like precision, recall, F1-score, and ROC-AUC. Are comfortable working with large-scale data systems (Snowflake, BigQuery, or similar). Are curious about AI agents, and how data can shape the reasoning, adaptability, and behavior of intelligent systems. Enjoy collaborating with cross-functional teams — from engineers to research scientists — to define meaningful KPIs and experiment setups. This listing is only for people residing in India. Primary Goal of This Role To desig...
...decision to be driven by data rather than guesswork. The focus is on two areas: • Inventory management – I need an AI model that forecasts stock needs, flags slow movers, and recommends optimal reorder points. • Sales analytics – I want clear, actionable insights on what sells, when, and to whom so I can quickly adjust marketing and pricing. Google AI is my preferred stack, so think Vertex AI, BigQuery, Looker Studio, and any other Google Cloud services that make sense. If there’s a smarter way to stitch these together, I’m open to your guidance. Deliverables 1. End-to-end setup of the inventory-forecasting pipeline, fully connected to my store’s data feed. 2. An interactive sales analytics dashboard with real-time or near-real-tim...
...strict requirements for delivering the project. Architecture & Stack Mobile app (B2C): cross-platform development (Flutter or React Native) for iOS/Android. Web platform (B2B): React.js or Vue.js web application with an analytics dashboard. Backend & Data: Python (Django/FastAPI) for the algorithmic logic. Database: a data-warehouse-style architecture (PostgreSQL + BigQuery/Snowflake) capable of handling millions of transaction rows. Critical features to develop The ETL (Extract-Transform-Load) pipeline: an automated script that cleans and anonymises raw data from the B2C app into the B2B data warehouse. Mandatory: irreversible removal...
...Google BigQuery. My current priority is data querying and retrieval, specifically: • Basic SELECT statements • Join operations • Aggregate functions with GROUP BY I’d like a mentor who can demonstrate each concept directly in BigQuery, assign short practice tasks, and review my work so I see where to improve. Explanations should stay clear and practical, gradually adding optimisation tips and BigQuery-specific features once the core techniques sink in. By the end of our time together I should be able to write clean SELECTs, build reliable joins, summarise data confidently, and understand how query plans reveal efficiency issues. A small set of completed exercises stored in a shared repo will serve as proof of progress. If you have expe...
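A first practice exercise of the kind described: one SELECT combining a join with an aggregate and GROUP BY. Run here against SQLite for self-containment (table and column names are invented); the query is standard SQL, so it runs in BigQuery almost unchanged.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders(order_id INT, customer_id INT, total REAL);
CREATE TABLE customers(customer_id INT, region TEXT);
INSERT INTO customers VALUES (1, 'EU'), (2, 'US');
INSERT INTO orders VALUES (10, 1, 20.0), (11, 1, 30.0), (12, 2, 15.0);
""")
# Join orders to customers, then summarise per region.
rows = conn.execute("""
    SELECT c.region, COUNT(*) AS n_orders, SUM(o.total) AS revenue
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('EU', 2, 50.0), ('US', 1, 15.0)]
```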
...re-gridded to 1 km, stored in GCS & BigQuery partitioned by date. Training pipeline (Vertex AI custom job) Model: Temporal-Fusion-Transformer (PyTorch Lightning) – encoder 12 days, decoder 60 days. Physics-aware loss (optionally embed ERA5 precipitation as residual). Hyper-parameter search to beat ECMWF HRES RMSE by ≥ 15 % on held-out 2023 global data. Target skill: ≥ 80 % relative MAE reduction vs. baseline climatology (we will verify with independent rain-gauge split). Serving layer FastAPI container, autoscaling Vertex AI Endpoint (GPU T4, min 0). /predict – JSON in {"cube_id": int, "past_seq": [[float]]} → {"precip_mm_day": [60 values]} latency < 200 ms p99. /health + automatic CI/CD via Cloud Build. Infrastructur...
...can refresh without manual steps. On the Alpaca side I need clean, version-controlled code that signs, submits, and monitors orders through their REST and streaming APIs, then routes any relevant events back into the app. Key deliverables I will be checking against: • A working code sample or module that authenticates to Google Cloud, proves access (for example, pulling a test dataset from BigQuery or writing to Cloud Storage), and can be dropped into my app. • A parallel module that logs in to Alpaca, places a dummy trade in paper mode, streams order updates, and surfaces the response to the UI. • Clear, step-by-step documentation covering credential creation, environment variables, and any third-party libraries, so I can reproduce the setup on another mach...
...into user behaviour, not just traffic counts or SEO metrics. The goal is to understand exactly how visitors move from page to page—where they start, the routes they take, where they hesitate, and where they drop off. In short, the focus is purely on user-flow analysis of our website data. You are free to work in Python (pandas, numpy, scikit-learn), R, or even directly inside Google Analytics / BigQuery if that speeds things up. Feel free to visualise flows with tools such as Tableau, Power BI, Looker Studio, or custom D3/Sankey charts—whatever makes the navigation paths crystal-clear. Deliverables • A cleaned, documented dataset ready for future reuse • Clear visualisations of the navigation flow (e.g., Sankey or funnel diagrams) • A concise re...
I need a Looker Studio (formerly Data Studio) specialist who can devote 30 consecutive business days to my account, working 6:00 AM–4:00 PM CST. During that window you’ll be hands-on inside the platform, unifying data from our CRM system, paid and organic social media channels, sales database, Google properties stored in BigQuery, and several custom APIs. Once the connections are stable and refreshing on schedule, the focus shifts to insight: you’ll design clear, role-based dashboards that highlight the metrics we run the business on—conversion rate, monthly recurring revenue, campaign-level marketing analytics, performance scorecards, and sales pipeline metrics. These views must update automatically and remain intuitive for executives and marketers who wo...
...through our approved dealer network or HomeCars Charities. ⸻ Step 3: Close & Drive Forward Finalize your purchase, receive your credit, and drive away with value already built in. No lenders. No approvals. Just transparency and opportunity. ⸻ Technical Requirements for Developer Backend • Scalable cloud infrastructure (AWS / GCP / Azure) • Database optimized for large datasets (PostgreSQL, BigQuery, or similar) • API-first architecture • Role-based authentication • High-availability and redundancy Frontend • Clean, modern UI • Fast loading • SEO-friendly • Responsive (mobile + desktop) • Clear dashboards for each user type Automation • Automated credit calculation engine • Automated document gener...
...Console, Metricool, Office 365, and BigQuery) and will consolidate the information into an executive dashboard in Stratio with key indicators for traffic, behavior, ROI, and ROAS. PROJECT SCOPE Design and implementation of the DataLayer: technical configuration in Google Tag Manager. Definition of custom variables and events (up to 5 funnels). Configuration of funnels and external tracking (Office 365 / links): measurement of external forms and redirects outside the main domain. Cross-source traceability and logging of external events. Integration of analytics and advertising platforms: connection and technical validation of GA4, Clarity, Search Console and ...
...Builder internals and limitations Tool calling and function schemas (advanced patterns) File, browser, code, retrieval and custom tool integration Prompt layering: system → developer → agent memory Guardrails, refusal handling, and safety controls Cost optimization and latency trade-offs 3. Gemini Agent Builder / Vertex AI Agents Gemini agent architecture vs OpenAI agents Native connectors (BigQuery, GCS, Google Drive, APIs) Tool invocation and grounding with enterprise data Differences in memory, reasoning, and orchestration Strengths/weaknesses vs OpenAI agents 4. Connectors & Data Integration (Critical) Designing scalable connectors (APIs, databases, SaaS tools) Retrieval-augmented agents vs tool-based agents Sync vs async data flows Security, per...
...read those three parameters on the ESP32 at a sensible sampling rate, push them securely over Wi-Fi, and have them land in my Google Cloud account for long-term storage and future dashboarding. Although my original note mentioned AWS, I’ve decided to proceed with Google Cloud instead, so please use the tools you feel most comfortable with there (IoT Core, Pub/Sub, Cloud Functions, Firestore, BigQuery—whatever keeps the solution simple and maintainable). TLS encryption and device authentication are mandatory. I’ll provide: • The hardware (ESP32, energy-meter IC, power supply) • Wi-Fi credentials for bench testing • Access to a fresh Google Cloud project What I need back: • Fully commented ESP32 firmware (Arduino or ESP-IDF) that cap...
I'm seeking an experienced Google Cloud Platform analyst, specifically for BigQuery and Cloud Data Fusion. I need help learning how to map and transform data in GCP. Key Requirements: - In-depth analysis of BigQuery - Insights on performance and data mapping. Ideal Skills & Experience: - Extensive experience with BigQuery - Proven track record in cloud data analysis - Ability to provide actionable insights Please include relevant past work in your application.
I need a fully-automated analytics stack that funnels data from GoHighLevel, Stripe, Google Analytics 4, and Whop into BigQuery and then visualises the key KPIs in Looker Studio. The warehouse should be designed primarily for reporting—clean, well-modelled tables that refresh on a reliable schedule—so I can quickly slice everything from acquisition costs to sales-team close rates without wrestling with raw exports. Scope of work • Build or configure the ETL pipelines that extract each source’s data, load it into BigQuery, and transform it into a unified schema. If an off-the-shelf connector (Fivetran, Airbyte, native GA4 export, etc.) is the best option, set it up; if custom Cloud Functions or SQL is more practical, document the logic clearly. &b...
In this milestone, I will develop the initial data pipeline for AVM and set up the BigQuery schema according to the mandatory architecture. This includes: Creating BigQuery tables with partitioning by assessedYear and clustering by zipCode, propertyType, and yearBuilt. Writing a cloud-native Python script to process raw parcel data, historical sales data, and external API inputs. Ensuring that the data pipeline is clean, modular, and ready for integration with the computation engine in the next milestone. Deliverables will include the script, BigQuery table structure, and a brief document explaining the setup and how to run it.
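The partitioning and clustering spec above maps to a single BigQuery DDL statement. Since `assessedYear` is an integer rather than a date, partitioning would use `RANGE_BUCKET`; the dataset/table names, extra columns, and the year range below are assumptions for illustration, not the actual schema.

```python
# Sketch of the BigQuery DDL for the AVM table: integer-range partitioning
# on assessedYear, clustering on zipCode, propertyType, yearBuilt.
ddl = """
CREATE TABLE IF NOT EXISTS avm.parcels (
  parcelId      STRING,
  zipCode       STRING,
  propertyType  STRING,
  yearBuilt     INT64,
  assessedYear  INT64,
  assessedValue NUMERIC
)
PARTITION BY RANGE_BUCKET(assessedYear, GENERATE_ARRAY(1990, 2040, 1))
CLUSTER BY zipCode, propertyType, yearBuilt
"""
# Would be executed with: google.cloud.bigquery.Client().query(ddl).result()
print(ddl.strip().startswith("CREATE TABLE"))  # True
```

Clustering column order matters: filters on `zipCode` prune most effectively because it is listed first.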
...Geolocation + proximity engine. Coupon & gamification rules engine. Event management. Business directory management. Incident/SOS management. Notification engine (email, SMS, WhatsApp, web push). Microservices for independent scaling (SOS, maps, notifications). 3.3 Data & AI Layer Front-end: Vue. Operational DB: Supabase (users, sessions, scans, events, coupons, SOS). Data warehouse (BigQuery/Redshift/Snowflake). ETL/ELT for logs ingestion, cleaning, anonymization. AI models: recommendations, predictive heatmaps, under-activated zone detection, anomaly detection (security/fraud). 3.4 Integration Layer Municipality systems: licenses, tourism, culture, security, civil protection. Emergency & medical: C5/911, Red Cross, hospitals (API/web console). Third ...
I'm seeking an expert to assess and recommend a new data warehouse and machine learning platform to replace our current SAS Base system. Id...running data and licenses) - Migration time frame - Training/skills required - Ability to deploy/develop dedicated models etc. Key Requirements: - Handle data storage, ETL, visualization, reporting, and ML modeling - Blend seamlessly with AI tools - Scalability, performance, and ease of integration are critical - Open to various vendors (AWS, Google Cloud, Azure, Databricks, Snowflake, BigQuery, SAS Viya) Ideal Skills: - Experience with modern data platforms - Strong knowledge of ETL processes and ML - Familiarity with AI integration - Vendor comparison expertise Your insights will help us choose the right platform...
...it every day, so every millisecond counts. The core stack is Ruby on Rails and Go running on both AWS and GCP. Data flows through event-driven pipelines into DynamoDB and BigQuery and finally lands in email templates that you will own end-to-end. Here is what I need built or improved right now: • New and refactored micro-services in Go or Rails that can fan out millions of alerts without breaking latency targets. • Health-checks, metrics and auto-scaling hooks wired into CloudWatch, Stackdriver or your preferred observability tool so we always know where we stand. • Robust connectors for DynamoDB streams and BigQuery jobs to keep the data pipeline real-time. • A small library of maintainable, parametrized email templates that marketing can twea...
...running. The previous developer completed the setup and delivered working scrapers, schedules, and a BigQuery connection. I need a developer to review the build, stabilize it, and finish the remaining pieces. What is already done: Apify organization created Two working Actors running on schedule Facebook Posts Scraper Instagram Profile Scraper Existing saved tasks Storage dataset creation Service account permissions set BigQuery dataset ready Tables receiving data daily What I need completed: Confirm Actors run without failure Add retry logic and error handling Clean and structure the scraped data Write transformations for sentiment and keyword flags Load cleaned data into final BigQuery tables Build a Looker Studio dashboard using these tables Doc...
...implementing automation workflows, data analytics, and conversion funnels. We currently have a Webflow-based website that includes a custom-built Learning Management System (LMS) and use Memberstack for authentication and gated content. We manage back-end automations primarily through Integromat (Make) and store data in Airtable. All user data is tracked and analyzed via Google Analytics and stored in BigQuery for deeper insights. We also rely on a Node.js backend, hosted on AWS, for complex operations and custom logic that other no-code tools can't handle. For search, we sync Webflow content with Elasticsearch for fast, scalable search capabilities. Additionally, we leverage a custom AI assistant powered by GPT for course content interactions, and we utilize SendGrid for...
A CA civil rights law firm is building an internal tool that auto-captures billable legal work from Google Workspace, Slack, Zoom, and Google Voice, drafts proposed billing entries into an accept/edit/reject user interface, and posts approved items to our timekeeping software, Clio. ...time. • Review UI: Approve / Edit / Split / Reject with evidence links and audit trail. • Clio integration: create Activities on approval; prevent duplicates. Relevant experience would include: • Google Workspace APIs (Calendar watch, Drive Activity, Gmail watch, Meet/Reports), Zoom REST, Slack Events, Clio API v4. • GCP (Cloud Run/Functions, Pub/Sub, Secret Manager/KMS, Firestore/BigQuery), OAuth 2.0, RBAC, logging. • Strong security hygiene; can document scopes, ...