Press "Enter" to skip to content

Category: Machine Learning

Kernel Methods in Python

Matthew Mayo does a bit of kernel work:

Kernel methods are a powerful class of machine learning algorithms that allow us to perform complex, non-linear transformations of data without explicitly computing the transformed feature space. These methods are particularly useful when dealing with high-dimensional data or when the relationship between features is non-linear.

Kernel methods rely on the concept of a kernel function, which computes the dot product of two vectors in a transformed feature space without explicitly performing the transformation. This is known as the kernel trick. The kernel trick allows us to work in high-dimensional spaces efficiently, making it possible to solve complex problems that would be computationally infeasible otherwise.

Read on for the pros and cons of kernel methods and a pair of techniques that use them.
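
To make the trick concrete, here is a small sketch (not from Matthew's article) showing that a degree-2 polynomial kernel reproduces the dot product of an explicit feature expansion without ever building that expansion:

import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

def phi(v):
    # Explicit degree-2 polynomial feature map (with bias term), built only for comparison
    return np.array([1.0,
                     np.sqrt(2) * v[0], np.sqrt(2) * v[1],
                     v[0] ** 2, v[1] ** 2,
                     np.sqrt(2) * v[0] * v[1]])

explicit = phi(x) @ phi(y)     # dot product in the expanded feature space
kernel = (x @ y + 1) ** 2      # same value via the polynomial kernel, no expansion needed

print(explicit, kernel)        # both print 144.0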

Shrinking ONNX Files

Pete Warden breaks out the shrink ray:

I’ve been using the ONNX Runtime a lot recently, and while it has been a lot of fun, there are a few things I’ve missed from the TensorFlow Lite world. The biggest (no pun intended) is the lack of tools to shrink the model file size, something that’s always been essential in the mobile app world. You can quantize using the standard ONNX tools, but in my experience you’ll often run into accuracy problems because all of the calculations are done at lower precision. These are usually fixable, but require some time and effort.

Read on for Pete’s preferred alternative and a new tool to help with this.
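
For reference, the "standard ONNX tools" route Pete mentions includes dynamic quantization in onnxruntime. A minimal sketch of that stock approach (the file names are placeholders, and this is not Pete's preferred alternative):

# pip install onnx onnxruntime
from onnxruntime.quantization import quantize_dynamic, QuantType

# "model.onnx" and "model_int8.onnx" are placeholder file names
quantize_dynamic(
    model_input="model.onnx",
    model_output="model_int8.onnx",
    weight_type=QuantType.QInt8,   # store weights as 8-bit integers
)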

Working with the Azure AI Document Service

Tomaz Kastrun continues a series on Azure AI. First up is a visual review of the Azure AI Document service:

Vision and Document services give your apps the ability to analyze images, process documents, and use optical character recognition (OCR) in combination with machine learning.

That product has gone through a few name iterations, including Form Recognizer. But wait, there’s more!

Tomaz also takes a look at the Python SDK:

The Vision and Document SDK for Python extends these services so you can add them to your own apps.

To use the Vision and Document SDK with Python, you will need to have the resource up and running (to start, go with the free pricing tier (F0)) and get the Document Intelligence API key and endpoint address.

Click through for an example of how that works.
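
As a rough illustration of the pattern (not Tomaz's exact code), here is what calling the prebuilt OCR model looks like with the azure-ai-formrecognizer package; the endpoint, key, and file name are placeholders:

# pip install azure-ai-formrecognizer
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "<your-api-key>"                                             # placeholder

client = DocumentAnalysisClient(endpoint=endpoint, credential=AzureKeyCredential(key))

# Run the prebuilt OCR ("read") model against a local file
with open("sample.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-read", document=f)
result = poller.result()

for page in result.pages:
    for line in page.lines:
        print(line.content)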

Using the Azure AI Language and Translation Python SDK

Tomaz Kastrun continues a series on Azure AI:

To use the SDK for the “Language + Translation” service, install

pip install azure-ai-textanalytics==5.2.0

then add your endpoint in a format like https://yyyyy_azurehub_xxxxxxx.cognitiveservices.azure.com/

along with the secret (key) for that endpoint. You will also need the region name (e.g., west-europe).

Once you’ve set up the necessary credentials, Tomaz shows how easy it is to call the service.
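
For a sense of what a call looks like once the credentials are in place, here is a minimal sketch of language detection with that same azure-ai-textanalytics package; the endpoint, key, and sample text are placeholders rather than Tomaz's code:

# pip install azure-ai-textanalytics==5.2.0
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "<your-api-key>"                                             # placeholder

client = TextAnalyticsClient(endpoint=endpoint, credential=AzureKeyCredential(key))

docs = ["Azure AI services make text analysis straightforward."]   # sample input

# Detect the language of each document
for doc in client.detect_language(documents=docs):
    print(doc.primary_language.name, doc.primary_language.confidence_score)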

An Overview of the Azure AI Services Speech Service

Tomaz Kastrun has been busy with the Azure AI series. First up is an overview of Azure AI Services (née Cognitive Services) available in the Azure AI Foundry:

In Azure AI Foundry, you can always go to Azure AI Services, where you can create intelligent apps with different AI models. These services are simple and ready to use, with relatively low costs.

Then Tomaz drills into the Speech service:

In Azure AI Foundry you will find the speech playground, with a wide variety of solutions for enhancing your applications and adding functionality to them.

The Speech service gives you capabilities for speech-to-text conversion, real-time translation, fast transcription, voice assistants, and more.

After that, we get an intro of Speech Studio:

Speech Studio (available at https://speech.microsoft.com/portal) is a set of UI-based tools for building and integrating features from the Azure AI Speech service (available in the Azure portal) into your applications using a no-code approach. You can also create projects that use and reference these assets and services through the Speech SDK, the Speech CLI, or the REST APIs.

The Speech service is by no means perfect, but it’s interesting just how well it can do at detecting languages (one set of functionality) and translating arbitrary audio from one language to another (via a different call).
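
As a rough sketch of the SDK side (not from Tomaz's posts), recognizing a single utterance with the azure-cognitiveservices-speech package looks something like this; the key and region are placeholders:

# pip install azure-cognitiveservices-speech
import azure.cognitiveservices.speech as speechsdk

# Placeholder key and region from your Speech resource
speech_config = speechsdk.SpeechConfig(subscription="<your-key>", region="westeurope")

# Recognize a single utterance from the default microphone
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)
result = recognizer.recognize_once()

if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print(result.text)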

A Survey of Predictive Analytics Techniques

Akmal Chaudhri tries a bunch of things:

In this short article, we’ll explore loan approvals using a variety of tools and techniques. We’ll begin by analyzing loan data and applying Logistic Regression to predict loan outcomes. Building on this, we’ll integrate BERT for Natural Language Processing to enhance prediction accuracy. To interpret the predictions, we’ll use SHAP and LIME explanation frameworks, providing insights into feature importance and model behavior. Finally, we’ll explore the potential of Natural Language Processing through LangChain to automate loan predictions, using the power of conversational AI.

Click through for the notebook, as well as an overview of what the notebook includes. I don’t particularly like word clouds as the “solution” in the BERT example, though without real data to perform any sort of NLP, there’s not much you can meaningfully do.
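
For readers who want the flavor of the first step without opening the notebook, here is an illustrative logistic regression on fabricated loan-style features; the data and the approval rule are invented for this sketch and are not Akmal's dataset:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Fabricated toy features standing in for income, loan amount, and credit history
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
# Invented rule: higher "income" and better "credit history" tilt toward approval
y = (X[:, 0] - 0.5 * X[:, 1] + X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))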

Referencing a Microsoft Fabric ML Model from another Workspace

Sandeep Pawar crosses workspaces:

I have written a couple of blogs about working with ML models in Microsoft Fabric. Creating experiments and logging and scoring models in Fabric is very easy, thanks to the built-in MLflow integration. However, the Fabric Data Science experience has one limitation. There are no model endpoints yet, and you cannot load a model from another workspace because the model URI, unlike in Databricks, does not reference a workspace. If you use MLFlowTransformer as shown in this blog, only the model from the workspace where the notebook is hosted is loaded. However, there is a workaround.

Read on for that workaround, as well as the core limitation associated with it.
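
For context, the usual MLflow pattern inside a Fabric notebook loads a registered model by a URI that names only the model and version, which is why there is no built-in way to point at another workspace. A minimal sketch of that default behavior (placeholder model name and invented scoring data, not Sandeep's workaround):

import mlflow
import pandas as pd

# Placeholder model name and version; the URI names only the model and version,
# so MLflow resolves it against the workspace hosting the notebook
model = mlflow.pyfunc.load_model("models:/my-loan-model/1")

# Invented scoring input with whatever features the model expects
scoring_df = pd.DataFrame({"income": [52000], "loan_amount": [15000]})
print(model.predict(scoring_df))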
