Here are 91 public repositories matching this topic.
Prebuilt binaries of OpenCV on Windows
A first CuPy example computing GEMM three ways: with cuBLAS, with a handwritten CUDA kernel, and with NumPy's BLAS.
Updated Jun 16, 2019 · Cuda
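The repository above contrasts a hand-rolled GEMM kernel with a vendor BLAS call. A minimal CPU-only sketch of the same comparison (no GPU assumed; `naive_gemm` is an illustrative name, not from the repo) pits a triple-loop kernel against NumPy's BLAS-backed `matmul`:

```python
import numpy as np

def naive_gemm(a, b):
    """Handwritten triple-loop GEMM -- the role the custom CUDA kernel
    plays in the repository's GPU comparison."""
    m, k = a.shape
    k2, n = b.shape
    assert k == k2
    c = np.zeros((m, n), dtype=a.dtype)
    for i in range(m):
        for j in range(n):
            for p in range(k):
                c[i, j] += a[i, p] * b[p, j]
    return c

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = rng.random((64, 64))

# np.matmul dispatches to whatever BLAS NumPy was built against
# (MKL, OpenBLAS, ...), analogous to calling cuBLAS on the GPU.
c_blas = a @ b
c_naive = naive_gemm(a, b)
print(np.allclose(c_blas, c_naive))
```

The two results agree within floating-point tolerance; the BLAS path is typically orders of magnitude faster at realistic sizes.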
Problems from the June day-to-day challenge on LeetCode
Updated Jun 16, 2020 · Python
Containerized versions of Intel's optimized tools and frameworks for machine learning and deep learning
MATLAB exercises for Scientific Computation.
Updated Nov 28, 2023 · MATLAB
A thin Cython/Python wrapper around some routines from Intel MKL
Updated Jun 21, 2022 · Python
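Wrappers like the one above expose individual BLAS/LAPACK routines to Python. As a hedged sketch of what calling such a routine directly looks like, SciPy already ships low-level BLAS bindings (backed by MKL when SciPy is built against it); here `dgemm` computes `alpha * a @ b`:

```python
import numpy as np
# scipy.linalg.blas exposes the underlying BLAS routines directly;
# the backend (MKL, OpenBLAS, ...) depends on how SciPy was built.
from scipy.linalg.blas import dgemm

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[5.0, 6.0], [7.0, 8.0]])

# dgemm computes alpha * a @ b (plus beta * c when c is supplied)
c = dgemm(alpha=1.0, a=a, b=b)
print(c)
```

Calling the routine directly skips NumPy's dispatch layer and gives access to BLAS-specific parameters such as `trans_a`/`trans_b` and in-place accumulation via `beta` and `c`.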
Laboratory exercises for the course Microprocessor and Embedded Systems (IMP), third semester of the BIT bachelor's programme at FIT VUT/BUT, academic year 2023/2024
This repository leverages Intel CPUs' BLAS-oriented instructions for very fast linear algebra calculations.
JAMU - Java Matrix Utilities built on top of Intel's oneAPI Math Kernel Library (oneMKL)
Updated Apr 12, 2024 · Java
The heat equation in three dimensions
Task 1, Subtask 2: measuring the performance of OpenBLAS, Intel MKL, and pthreads for matrix multiplication, and plotting the results as boxplots with gnuplot.
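The measurement side of a task like the one above can be sketched with wall-clock timings of repeated matmuls; the per-repeat samples are exactly what a boxplot summarizes. This is an illustrative sketch (`time_matmul` is a hypothetical helper, and it times whichever BLAS backs the local NumPy rather than each library separately):

```python
import time
import numpy as np

def time_matmul(n, repeats=5):
    """Return per-call wall-clock times (seconds) for an n x n matmul --
    the raw samples a boxplot of BLAS performance would be drawn from."""
    rng = np.random.default_rng(1)
    a = rng.random((n, n))
    b = rng.random((n, n))
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        a @ b  # dispatched to NumPy's BLAS backend (OpenBLAS, MKL, ...)
        times.append(time.perf_counter() - t0)
    return times

for n in (128, 256, 512):
    ts = time_matmul(n)
    print(f"n={n}: min={min(ts):.6f}s median={sorted(ts)[len(ts) // 2]:.6f}s")
```

To compare libraries as the task does, the same script would be run against Python environments linked to each BLAS in turn, and the collected samples fed to gnuplot.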
Implemented deep neural network (DNN) inference for classifying across 12 audio keywords. The model is pre-trained and the weights are provided as input; the task is to perform the inference and compute the outputs efficiently.
Updated Jul 21, 2018 · Dockerfile
Multiple Kernel Learning Model for Relating Structural and Functional Connectivity in the Brain
Updated Feb 14, 2018 · MATLAB
C++17 wrapper library for FFT-related computations on CPUs and GPUs
manylinux2014 Python package builds
Fast matrix multiplies with Intel MKL
Updated Jun 12, 2020 · Jupyter Notebook
Deep Learning Framework from Google
Nim CBLAS and LAPACKE bindings for Intel MKL