Your GCP Partner for AI & Data Innovation

I specialize in delivering powerful solutions on Google Cloud, empowering you to leverage the full potential of Vertex AI and advanced data engineering. My expert team provides pre-built solutions and ongoing support to ensure your success.

Accelerate Your Innovation Journey



Why Choose Me?


  • Unlock Your Data's Potential: I empower you to extract meaningful insights and drive data-driven decisions, leading to tangible business outcomes.
  • Deep GCP Expertise: I am an expert in leveraging Google Cloud's AI and data services.
  • Proven Solutions: My solutions are built on industry best practices and proven methodologies.
  • Dedicated Support: I provide comprehensive support to ensure your success.
  • Seamless Marketplace Integration: Deploy and manage my solutions directly within GCP.

Fiverr & Freelancer Marketplaces: I also provide freelance services to businesses on the platforms listed below:

Cross-Functional Collabs

Collaboration between technical and business teams.

Freelancing Projects
  • Fiverr Cloud Architect
  • Fiverr AI / MLOps / DevOps
  • Freelancer Data Scientist
Customer-Centric Agility

Build a team that can adapt to change and understand customer needs.

Specialized Development
API, Python, Ruby on Rails, Scala, Swift, SQL/NoSQL, Typescript


Portfolio

This is where I showcase my work, personality, and talents all in one place. It proves my skills, shares my enthusiasm for the profession, and gives potential employers and clients a glimpse of what it would be like to work with me. Check out my LinkedIn and resume for a focus on my experience. For short-term projects or assistance, find me on Fiverr or contact me.

Web 99+ Games 5+ Services 9+
Livestream 2+ AI 4+ Training 49+
MLOps 4+ Marketplace 9+ Other 99+

WEB

Textual, visual, or aural content that is encountered as part of the user experience on websites. It may include—among other things—text, images, sounds, videos, and animations.

99+ items

GAMES

Electronic game that interacts with a user interface or input device (joystick, controller, keyboard, or motion sensing device) to generate visual feedback shown on a video display device, TV, monitor, touchscreen, or virtual reality headset.

5+ items

SERVICES

On-demand computer program designed to carry out a specific task relating to the operation of the device itself, typically used by end users.

9+ items

LIVESTREAM

Online streaming media simultaneously recorded and broadcast in real-time or any media delivered and played back simultaneously without requiring a completely downloaded file.

2+ items

AI

Software designed to combine data engineering, data science, and ML engineering workflows to manage, develop, deploy, interpret, and monitor models.

4+ items

TRAINING

Different kinds of exercises to practice and improve your coding skills. Some types of exercise are well known, but others aren't as widely used as they should be.

49+ items

MARKETPLACE

A digital store where users can discover, deploy, and manage a wide range of third-party software solutions on the Google Cloud Platform (GCP).

4+ items

ML-OPS

The process of managing the machine learning life cycle, from development to deployment and monitoring.

4+ items

OTHER

Any other collection of instructions and data that tells a computer or device how to work, plus the 100 Days of Code challenge.

99+ items

BLOG


Get Fired Up: Google Wins and AI Adventures!

"...Seriously, its been a wild ride of cloud wizardry and AI breakthrough!"

Continue...

Gaming

New Releases

4/2025

A list of all upcoming titles and DLC on PS5, Xbox Series X, PS4, Xbox One, PC, and Switch.

Continue...
Development

Game Server

4/2025

Lessons learned after discovering GKE, Agones, and Vertex AI #GameServer

Continue...

From the Laptop

Get Fired Up: Google Wins and AI Adventures!

... Seriously, it's been a wild ride of cloud wizardry and AI breakthroughs!

Alright, folks! Friday is here, and I'm buzzing with excitement to share what's been happening this week. Seriously, it's been a wild ride of cloud wizardry and AI breakthroughs!

1. Cloud Migration: We Crushed It!

First off, we tackled our cloud migration optimization head-on. And guess what? We absolutely crushed it! We dug deep into the numbers, tweaked the processes, and boom! We slashed timelines and saved a ton of resources. It's like finding hidden treasure when you see those optimizations pay off.

2. GKE + AI: Model Magic in Action!

Then we dove into deploying our AI models on Google Kubernetes Engine (GKE). Watching those models come to life in a scalable, rock-solid environment? Pure magic! GKE is seriously powerful, and seeing our AI in action is just...WOW.

3. FaaS: Serverless Superpowers!

Let's talk serverless! We got our hands dirty with Functions as a Service, and it's been a game-changer. Building those nimble, on-demand functions has streamlined so much of our workflow. It's like giving our microservices superpowers.

4. Terraform: Infrastructure, Sorted!

And finally, we gave our infrastructure a serious upgrade with Terraform. Automating everything? It's like having a well-oiled machine. Consistency, reliability, efficiency - Terraform delivers!

In a Nutshell:

This week? It's been an absolute blast! We've pushed boundaries, solved tough problems, and witnessed some seriously cool tech in action. I'm so stoked about the progress we've made, and I can't wait to see what next week brings.

Have an awesome weekend, everyone! Let's keep innovating!



New Game Releases

APRIL 2025*

  • The Last of Us: Part II Remastered (PC)
  • Indiana Jones and the Great Circle (PS5)
  • Fatal Fury: City of the Wolves (PC)(PS5)(PS4)(Xbox Series X|S)
  • Clair Obscur: Expedition 33 (PC)(PS5)(Xbox Series X|S)
  • The Talos Principle: Reawakened (PC)(PS5)(Xbox Series X|S)
  • Lunar Remastered Collection (All Platforms)
  • South of Midnight (PC)(Xbox Series X|S)
  • Promise Mascot Agency (All Platforms)

What I'm playing and obsessed with:
  • Nothing... Minecraft is too tough, and I'm too busy to play the OG Call of Duty games
  • YouTube Playables while waiting for processing; there's one about parking and a farm game I like

Rewards and Challenges:

NONE; the state of gaming has me not interested until I can play Death Stranding 2.


*SOURCES: https://www.gamesradar.com/video-game-release-dates

Level Up Your AI Game

Top 10 GCP Essentials (From the Trenches!)

Hey AI developers! Let's get real for a sec. We're in the trenches, building the future, and we need the right tools to make that happen. After countless hours of experimentation and real-world deployment on Google Cloud Platform (GCP), I've compiled my absolute must-haves for supercharging your workflow. These aren't just features; they're game-changers!

1. Vertex AI Pipelines: Your Workflow's Best Friend!

Seriously, if you're not using this, you're missing out. This is where orchestration meets automation. Build, deploy, and manage your ML workflows with ease. It's like having a conductor for your AI symphony!
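As a quick illustrative sketch (not my production pipeline), here is roughly what compiling a tiny KFP pipeline and submitting it to Vertex AI Pipelines looks like; the project, region, and bucket values are placeholders:

PYTHON

# Minimal sketch: compile a tiny KFP pipeline and submit it to Vertex AI Pipelines.
# Project, region, and staging bucket values are placeholders.
from kfp import dsl, compiler
from google.cloud import aiplatform

@dsl.component
def say_hello(name: str) -> str:
    return f"Hello, {name}!"

@dsl.pipeline(name="hello-pipeline")
def hello_pipeline(name: str = "Vertex AI"):
    say_hello(name=name)

# Compile the pipeline definition to a job spec file
compiler.Compiler().compile(hello_pipeline, "hello_pipeline.json")

# Submit the compiled pipeline to Vertex AI Pipelines
aiplatform.init(project="my-project-id", location="us-central1",
                staging_bucket="gs://my-staging-bucket")
aiplatform.PipelineJob(
    display_name="hello-pipeline",
    template_path="hello_pipeline.json",
).run()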

2. Google Kubernetes Engine (GKE): AI Deployment Powerhouse!

GKE is where the magic happens. Scale your AI models like a pro, handle complex deployments, and keep everything running smoothly. Trust me, GKE is your AI deployment powerhouse.

3. Vertex AI Workbench: Your Development Hub!

Vertex AI Workbench is the ultimate development environment, Notebooks, collaboration, and seamless integration with other GCP services - it's all there. It's the hub that makes development feel smooth and efficient.

4. Cloud Storage: Data's Safe Haven!

Reliable, scalable, and secure. Cloud Storage is where you store your precious data. It's the bedrock of any serious AI project.
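For instance, here's a minimal sketch of pushing a local dataset into a bucket with the Python client (the project, bucket, and file names are placeholders):

PYTHON

# Minimal sketch: upload a local file to Cloud Storage (names are placeholders).
from google.cloud import storage

client = storage.Client(project="my-project-id")
bucket = client.bucket("my-training-data-bucket")

# Upload train.csv to gs://my-training-data-bucket/datasets/train.csv
blob = bucket.blob("datasets/train.csv")
blob.upload_from_filename("train.csv")

print(f"Uploaded to gs://{bucket.name}/{blob.name}")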

5. BigQuery: Data Analysis on Steroids!

Need to analyze massive datasets? BigQuery is your weapon of choice. It's fast, powerful, and makes data exploration a breeze.
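As a tiny sketch (querying a public dataset; the project ID is a placeholder), running a query with the Python client looks like this:

PYTHON

# Minimal sketch: run a BigQuery query and print the results.
from google.cloud import bigquery

client = bigquery.Client(project="my-project-id")

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

# query() starts the job; result() waits for it and returns the rows
for row in client.query(query).result():
    print(row["name"], row["total"])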

6. Cloud Functions: Serverless AI Snippets!

For those quick AI tasks or microservices, Cloud Functions are your go-to. Serverless, scalable, and super efficient. It's like having AI on demand!
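Here's a minimal sketch of an HTTP-triggered function with the Functions Framework; the "inference" below is just a stand-in for a real model call:

PYTHON

# Minimal sketch: HTTP-triggered Cloud Function (the scoring logic is a placeholder).
import functions_framework

@functions_framework.http
def predict(request):
    payload = request.get_json(silent=True) or {}
    text = payload.get("text", "")
    # Placeholder "inference": swap in a real model or Vertex AI endpoint call
    return {"input": text, "score": len(text)}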

7. Tensor Processing Units (TPUs): AI Acceleration!

When you need serious AI acceleration, TPUs deliver. They're designed for machine learning, and they'll supercharge your training times.
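As a rough TensorFlow sketch (assuming you're running on a TPU VM or a TPU-backed notebook), building a model under a TPU strategy looks like this:

PYTHON

# Minimal sketch: connect to a TPU and build a Keras model under TPUStrategy.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="local")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    # Any model built in this scope is replicated across the TPU cores
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")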

8. Vertex AI Model Monitoring: Keep Your Models Healthy!

Models drift. This is a fact. Vertex AI Model Monitoring helps you keep a close eye on your deployments, ensuring they stay accurate and effective.

9. Identity and Access Management (IAM): Security First!

Security is non-negotiable. IAM lets you control access to your resources with precision. Keep your data and models safe.
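For example, here's a minimal sketch of granting a service account read access to a bucket with the Python client; the bucket and service account names are made up:

PYTHON

# Minimal sketch: grant a service account read access to a bucket via IAM.
# The bucket and service account names are placeholders.
from google.cloud import storage

client = storage.Client(project="my-project-id")
bucket = client.bucket("my-training-data-bucket")

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {"serviceAccount:trainer@my-project-id.iam.gserviceaccount.com"},
})
bucket.set_iam_policy(policy)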

10. Cloud Logging and Monitoring: Insights at Your Fingertips!

Debugging, performance tracking, and real-time insights - Cloud Logging and Monitoring give you the visibility you need to keep your AI projects running smoothly.
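A minimal sketch of wiring standard Python logging into Cloud Logging (the project ID and messages are placeholders):

PYTHON

# Minimal sketch: route standard Python logging to Cloud Logging.
import logging
import google.cloud.logging

client = google.cloud.logging.Client(project="my-project-id")
client.setup_logging()  # attaches the Cloud Logging handler to the root logger

logging.info("Model server started")
logging.warning("Prediction latency above threshold: %d ms", 250)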

My Takeaway:

These are the tools that have made a real difference in my AI development journey. They're powerful, efficient, and they just work. If you're looking to level up your AI game on GCP, these essentials are where you need to focus.

Keep innovating, keep building, and let's push the boundaries of AI together!

YouTube


Infrastructure Magic: LLMs and Vertex AI

Hey AI enthusiasts! Ever felt the thrill of building something truly groundbreaking? That's exactly how I feel diving into the world of Large Language Models (LLMs) on Google Cloud's Vertex AI. It's like wielding a superpower, but instead of capes, we're using gcloud and Terraform! Let's embark on this journey together, exploring the ins and outs of setting up the perfect AI infrastructure.

Infrastructure with Terraform

First things first, let's lay a solid foundation. Infrastructure as Code (IaC) is our best friend, and Terraform makes it a breeze. Here's a snippet to get us started, creating a Vertex AI Workbench instance:

TERRAFORM: Vertex AI Workbench Instance
# Provisions a Vertex AI Workbench (user-managed notebooks) instance
resource "google_notebooks_instance" "llm_workbench" {
  provider     = google-beta
  name         = "llm-workbench-instance"
  location     = "us-central1-a"
  machine_type = "n1-standard-4"

  instance_owners = ["email@example.com"]

  vm_image {
    project      = "deeplearning-platform-release"
    image_family = "tf-latest-cpu"
  }

  labels = {
    env = "development"
  }
}

This code spins up a Vertex AI Workbench (user-managed notebooks) instance, perfect for experimenting with LLMs. Notice that we're using the google-beta provider to pick up the latest features!

Deploying an LLM

Now, let's get our hands dirty with gcloud. We'll create an endpoint and deploy a model to it using the Vertex AI API.

BASH
              
# Create a Vertex AI endpoint
gcloud ai endpoints create \
  --display-name=llm-endpoint \
  --region=us-central1

# Deploy a model from the Model Registry to the endpoint
# (replace ENDPOINT_ID and MODEL_ID with your own values)
gcloud ai endpoints deploy-model ENDPOINT_ID \
  --region=us-central1 \
  --model=MODEL_ID \
  --display-name=llm-deployment \
  --machine-type=n1-standard-4 \
  --traffic-split=0=100

These commands create an endpoint and deploy a registered model to it. It's amazing how quickly we go from zero to a fully deployed LLM!

Python Magic

Let's use the Vertex AI Python SDK to interact with our deployed model:

PYTHON
              
from google.cloud import aiplatform

# Initialize the SDK (project and location are placeholders)
aiplatform.init(project="my-project-id", location="us-central1")

# Reference the endpoint by its resource name or numeric ID
endpoint = aiplatform.Endpoint(
    "projects/my-project-id/locations/us-central1/endpoints/ENDPOINT_ID"
)

# Send a prediction request to the deployed model
response = endpoint.predict(
    instances=[{"content": "Tell me a short story about a coding bird."}]
)

print(response.predictions[0]["content"])

BOOM! We're talking to an LLM. The possibilities are endless!

Keeping Things Robust

Building a robust AI system isn't just about deploying; it's about scaling and monitoring. Vertex AI makes this easy. We can use Terraform to manage the endpoint itself, while the replica counts that drive autoscaling are set at model deployment time (see the Python sketch after the Terraform snippet):

TERRAFORM: Endpoint
resource "google_vertex_ai_endpoint" "llm_endpoint" {
  provider     = google-beta
  name         = "llm-endpoint"
  display_name = "llm-endpoint"
  location     = "us-central1"

  labels = {
    env = "development"
  }
}
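The replica bounds that drive autoscaling are set when the model is deployed to the endpoint. Here's a minimal Python SDK sketch (assuming a model already uploaded to the Vertex AI Model Registry; the IDs and project values are placeholders):

PYTHON

# Minimal sketch: deploy a registered model with autoscaling bounds.
# MODEL_ID and the project values are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project-id", location="us-central1")

model = aiplatform.Model("projects/my-project-id/locations/us-central1/models/MODEL_ID")
endpoint = aiplatform.Endpoint.create(display_name="llm-endpoint")

model.deploy(
    endpoint=endpoint,
    machine_type="n1-standard-4",
    min_replica_count=1,   # keep at least one replica warm
    max_replica_count=5,   # scale out under load
    traffic_percentage=100,
)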

And for monitoring, Vertex AI provides built-in tools to track performance and usage.

We are in the Future Now!

Working with LLMs on Vertex AI is like unlocking a new dimension of creativity and problem solving. It's more than just tech; it's about building tools that can change the world. Whether you're a seasoned AI engineer or just starting out, the power of Vertex AI is within reach.

What are you building with LLMs? Share your projects and experiences! Let's learn and grow together.


Kubernetes Kung Fu

Containerizing AI Apps and Mastering GKE!

Hey Cloud Ninjas!

Ever felt like wrangling AI applications is like trying to tame a wild beast? Well, don't panic! Today, we're strapping on our containerization belts and diving headfirst into Google Kubernetes Engine (GKE) to build scalable, resilient AI applications. Think of it as giving your AI the ultimate power up!

The Secret Sauce

First, let's talk containers. It's like packing your AI application into a neat, self-contained box. Docker makes this easy. Here's a simple Dockerfile for a Python-based AI app:

DOCKERFILE
                
FROM python:3.9-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["python", "app.py"]

This Dockerfile sets up a Python environment, installs dependencies, and runs our AI application. Simple, right?

The Orchestration Maestro

Now, let's bring in the maestro: Google Kubernetes Engine (GKE). GKE lets us orchestrate our containers, ensuring our AI app scales smoothly. Here's a snippet of a Kubernetes deployment YAML:

YAML
                  
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ai-app-deployment
spec:
  replicas: 3
  selector:
    matchLabels:
      app: ai-app
  template:
    metadata:
      labels:
        app: ai-app
    spec:
      containers:
      - name: ai-app-container
        image: gcr.io/your-project/ai-app-image:latest
        resources:
          requests:
            cpu: "500m"
            memory: "1Gi"
          limits:
            cpu: "1000m"
            memory: "2Gi"

This YAML file creates a deployment with three replicas of our AI application. We're also specifying resource requests and limits to keep things running smoothly.

Mastering Scalability

Here's the real magic: Kubernetes Horizontal Pod Autoscaling (HPA). This lets our AI app scale automatically based on resource usage.

BASH
kubectl autoscale deployment ai-app-deployment \
  --cpu-percent=80 \
  --min=3 \
  --max=10

This command sets up HPA to scale our deployment between 3 and 10 replicas, based on CPU usage. It's like giving our AI app a turbo boost!

Tips for Kubernetes Mastery

  • Use Namespaces: Organize your resources with namespaces. It's like having separate rooms in your Kubernetes house.
  • Leverage Labels and Selectors: Use labels and selectors to manage and query your resources efficiently (see the sketch after this list).
  • Monitor and Alert: Set up monitoring and alerting with tools like Prometheus and Grafana to keep an eye on your cluster.
  • Embrace Infrastructure as Code: Use tools like Terraform to manage your GKE cluster.
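To make the labels-and-selectors tip concrete, here's a rough sketch with the Kubernetes Python client; it assumes your kubeconfig already points at the GKE cluster, and the namespace and label are placeholders:

PYTHON

# Minimal sketch: query pods by label selector with the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # assumes kubeconfig points at your GKE cluster
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(namespace="ai-apps", label_selector="app=ai-app")
for pod in pods.items:
    print(pod.metadata.name, pod.status.phase)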

The Power of Scalable AI

With GKE, we can build AI applications that scale effortlessly, handling any workload. It's like having an AI army at your fingertips!


SOURCE: https://gemini.google.com/app


Turbocharged XR Pipelines with Serverless Sorcery

Hey Data Explorers! Feeling that mid-week slump? Let's kick it to the curb with some serious data engineering magic! We're diving into the world of streamlined data pipelines, BigQuery, Dataflow, and serverless Cloud Functions, all to power the immersive experiences of Virtual Reality (VR), Augmented Reality (AR), and Extended Reality (XR). Imagine reducing processing times by a whopping 40% - that's the kind of boost we're talking about!

The XR Data Challenge

Building XR experiences generates massive amounts of data: sensor readings, user interactions, 3D model data, and more. Processing this data efficiently is crucial for real-time insights and seamless user experiences.

Streamlining with Dataflow & BigQuery

Let's optimize our pipelines! Dataflow is our trusty sidekick for processing large datasets in parallel. Here's a glimpse of a Python Dataflow pipeline:

PYTHON
              
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def process_xr_data(element):
    # Perform data transformations and enrichments here
    return element

with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | 'Read from Pub/Sub' >> beam.io.ReadFromPubSub(topic='projects/your-project/topics/xr-data')
     | 'Process Data' >> beam.Map(process_xr_data)
     | 'Write to BigQuery' >> beam.io.WriteToBigQuery(
           table='your-project:xr_dataset.xr_data_table',
           schema='your-schema_string',
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )

This pipeline reads data from Pub/Sub, processes it, and writes it to BigQuery. BigQuery then becomes our powerful data warehouse for analysis and visualization.

Cloud Functions for Real-Time Insights

To add real-time capabilities, let's bring in Cloud Functions. Imagine triggering a function every time a user interacts with an AR object:

PYTHON
from google.cloud import bigquery

def handle_ar_interaction(data, context):
    # Insert a single AR interaction event into BigQuery
    client = bigquery.Client()
    table_id = "your-project.xr_dataset.ar_interactions"

    rows_to_insert = [
        {
            "user_id": data["user_id"],
            "object_id": data["object_id"],
            "timestamp": context.timestamp,
        }
    ]

    errors = client.insert_rows_json(table_id, rows_to_insert)
    if errors == []:
        print("New rows have been added.")
    else:
        print("Encountered errors while inserting rows: {}".format(errors))

This function inserts user interaction data into BigQuery in real time. Serverless functions are perfect for event-driven architectures!

The Impact

By optimizing our Dataflow pipelines and leveraging serverless functions, we've achieved a remarkable 40% reduction in processing time. This means faster insights, smoother user experiences, and more time for innovation in the XR space.

Architecture for Immersive Experiences

Our architecture looks something like this:

  • Data Ingestion: Sensor data, user interactions, and 3D model data are ingested via Pub/Sub (see the publisher sketch after this list).
  • Data Processing: Dataflow processes and transforms the data in parallel.
  • Data Storage: BigQuery stores the processed data for analysis.
  • Real-Time Insights: Cloud Functions provide real-time updates and triggers.
  • Visualization: Looker Studio or custom dashboards visualize the data for XR developers.
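As a tiny sketch of the ingestion side (the project, topic, and field names are placeholders), publishing an XR event to Pub/Sub looks like this:

PYTHON

# Minimal sketch: publish an XR interaction event to Pub/Sub.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("your-project", "xr-data")

event = {"user_id": "u-123", "object_id": "obj-42", "action": "grab"}
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print(f"Published message {future.result()}")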

Mid-Week Motivation Boost

Remember, data engineering is the backbone of groundbreaking technologies like VR/AR/XR. By streamlining our pipelines and leveraging serverless tools, we're not just moving data; we're powering the future of immersive experiences.


BLOG


Archives

  1. April 2025
  2. March 2025
  3. February 2025
  4. January 2025
  5. December 2024
  6. November 2024
  7. October 2024
  8. September 2024
  9. August 2024
  10. July 2024
  11. June 2024
  12. May 2024


Let's Connect

Have a question, an opportunity, a support request, or just want to say hello? I am easy to find.

Find me on Trello, Slack, or Jira.

keeyanajones@yahoo.com


keeyanajones@gmail.com


Contact Information

Reach out to me on one of these platforms.

Payment Methods

Fiverr and Freelancer have several payment methods available, depending on the device used, country of residence, and regulatory permissions. To see which apply, check the list or click the links below.

NOTE: I currently do not accept crypto as a payment method.