{"id":979,"date":"2024-04-12T13:44:14","date_gmt":"2024-04-12T12:44:14","guid":{"rendered":"https:\/\/shyuxinpc.com\/?p=979"},"modified":"2024-04-13T07:49:00","modified_gmt":"2024-04-13T06:49:00","slug":"intel-select-solutions-for-ai-inference","status":"publish","type":"post","link":"https:\/\/shyuxinpc.com\/cn\/intel-select-solutions-for-ai-inference\/","title":{"rendered":"Intel\u00ae Select Solutions for AI Inference"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"979\" class=\"elementor elementor-979\" data-elementor-post-type=\"post\">\n\t\t\t\t<div class=\"elementor-element elementor-element-abf868a e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"abf868a\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-99e31f4 elementor-widget elementor-widget-text-editor\" data-id=\"99e31f4\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>More and more businesses are hoping to leverage Artificial Intelligence (AI) to increase revenue, improve efficiency, and drive product innovation. 
It is particularly noteworthy that AI use cases based on Deep Learning (DL) technology can bring effective and practical insights; some of these use cases can advance progress in various industries, such as:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-fff2add e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"fff2add\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-be6d2f9 e-con-full e-flex e-con e-child\" data-id=\"be6d2f9\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-e1f46f4 solution-icon-001 elementor-position-top elementor-widget elementor-widget-image-box\" data-id=\"e1f46f4\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image-box.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<div class=\"elementor-image-box-wrapper\"><figure class=\"elementor-image-box-img\"><img fetchpriority=\"high\" decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-001.png\" class=\"attachment-full size-full wp-image-217\" alt=\"\" srcset=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-001.png 500w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-001-300x300.png 300w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-001-150x150.png 150w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/><\/figure><div class=\"elementor-image-box-content\"><h3 class=\"elementor-image-box-title\">Image Classification<\/h3><p class=\"elementor-image-box-description\">Can be used for concept allocation, such as facial emotion 
classification.<\/p><\/div><\/div>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-5547e21 e-con-full e-flex e-con e-child\" data-id=\"5547e21\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-4962edf solution-icon-001 elementor-position-top elementor-widget elementor-widget-image-box\" data-id=\"4962edf\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image-box.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<div class=\"elementor-image-box-wrapper\"><figure class=\"elementor-image-box-img\"><img decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-002.png\" class=\"attachment-full size-full wp-image-218\" alt=\"\" srcset=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-002.png 500w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-002-300x300.png 300w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-002-150x150.png 150w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/><\/figure><div class=\"elementor-image-box-content\"><h3 class=\"elementor-image-box-title\">Object Detection<\/h3><p class=\"elementor-image-box-description\">Can be used for object localization in autonomous driving technology.<\/p><\/div><\/div>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-009d503 e-con-full e-flex e-con e-child\" data-id=\"009d503\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-e935174 solution-icon-001 elementor-position-top elementor-widget elementor-widget-image-box\" data-id=\"e935174\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image-box.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<div 
class=\"elementor-image-box-wrapper\"><figure class=\"elementor-image-box-img\"><img decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-003.png\" class=\"attachment-full size-full wp-image-219\" alt=\"\" srcset=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-003.png 500w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-003-300x300.png 300w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-003-150x150.png 150w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/><\/figure><div class=\"elementor-image-box-content\"><h3 class=\"elementor-image-box-title\">Image Segmentation<\/h3><p class=\"elementor-image-box-description\">Can be used to outline organ contours in a patient's magnetic resonance imaging (MRI).<\/p><\/div><\/div>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-55422b7 e-con-full e-flex e-con e-child\" data-id=\"55422b7\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-9d4b744 solution-icon-001 elementor-position-top elementor-widget elementor-widget-image-box\" data-id=\"9d4b744\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image-box.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<div class=\"elementor-image-box-wrapper\"><figure class=\"elementor-image-box-img\"><img loading=\"lazy\" decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-004.png\" class=\"attachment-full size-full wp-image-220\" alt=\"\" srcset=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-004.png 500w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-004-300x300.png 300w, 
https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-004-150x150.png 150w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/><\/figure><div class=\"elementor-image-box-content\"><h3 class=\"elementor-image-box-title\">Natural Language Processing<\/h3><p class=\"elementor-image-box-description\">Can be used for text analysis or translation.<\/p><\/div><\/div>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-51a85d3 e-con-full e-flex e-con e-child\" data-id=\"51a85d3\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-34bff1f solution-icon-001 elementor-position-top elementor-widget elementor-widget-image-box\" data-id=\"34bff1f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image-box.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<div class=\"elementor-image-box-wrapper\"><figure class=\"elementor-image-box-img\"><img loading=\"lazy\" decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-005.png\" class=\"attachment-full size-full wp-image-221\" alt=\"\" srcset=\"https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-005.png 500w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-005-300x300.png 300w, https:\/\/shyuxinpc.com\/wp-content\/uploads\/2024\/04\/solution-icon-005-150x150.png 150w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/><\/figure><div class=\"elementor-image-box-content\"><h3 class=\"elementor-image-box-title\">Recommendation Systems<\/h3><p class=\"elementor-image-box-description\">Can be used to predict customer preferences in online stores or recommend higher-value products or services.<\/p><\/div><\/div>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-531927f e-flex 
e-con-boxed elementor-invisible e-con e-parent\" data-id=\"531927f\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-79e4ff6 elementor-widget elementor-widget-text-editor\" data-id=\"79e4ff6\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>These use cases are just the beginning. As businesses integrate AI into their operations, they will discover new ways to apply artificial intelligence. However, the commercial value of every AI use case depends on the inference speed of trained deep neural network models. The resources required to support inference on deep learning models can be very large, often pushing businesses to upgrade hardware to achieve the performance and speed they need. Many customers, however, prefer to extend their existing infrastructure rather than purchase new hardware for a single purpose. Your IT department is already familiar with the Intel\u00ae hardware architecture, and its flexible performance helps your existing IT investments go further. 
Intel\u00ae Select Solutions for AI Inference is a &#8220;one-stop&#8221; platform that provides pre-configured, optimized, and validated solutions, enabling low-latency, high-throughput inference on CPUs without the need for additional accelerator cards.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-66d01b8 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"66d01b8\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-3013ff1 elementor-widget elementor-widget-heading\" data-id=\"3013ff1\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Intel\u00ae Select Solutions for AI Inference<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-799ac05 elementor-widget elementor-widget-text-editor\" data-id=\"799ac05\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Intel\u00ae Select Solutions for AI Inference can help you get started quickly, leveraging solutions based on validated Intel\u00ae architecture to deploy efficient AI inference algorithms, thereby accelerating innovation and product launches. 
To speed up the inference and launch of AI applications, Intel\u00ae Select Solutions for AI Inference combine various Intel and third-party software and hardware technologies.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-0993843 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"0993843\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-e3b5670 elementor-widget elementor-widget-heading\" data-id=\"e3b5670\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Software Selection<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c970e24 elementor-widget elementor-widget-text-editor\" data-id=\"c970e24\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The software used in Intel\u00ae Select Solutions for AI Inference includes developer tools and management tools to assist with AI inference in production environments.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-e9101b3 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"e9101b3\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-387d5ad elementor-widget elementor-widget-heading\" data-id=\"387d5ad\" data-element_type=\"widget\" data-e-type=\"widget\" 
data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Intel\u00ae Distribution of OpenVINO\u2122 Toolkit<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d8d7fc1 elementor-widget elementor-widget-text-editor\" data-id=\"d8d7fc1\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The Intel\u00ae Distribution of OpenVINO\u2122 Toolkit, also known as the Intel\u00ae Distribution of Open Visual Inference and Neural Network Optimization Toolkit, is a developer suite designed to accelerate the deployment of high-performance artificial intelligence and deep learning inference. This toolkit optimizes models trained on various frameworks for multiple Intel\u00ae hardware options to provide outstanding performance deployment. The Deep Learning Workbench within the toolkit quantizes models to lower precision, converting models that typically use larger 32-bit floating-point numbers (commonly used for training and consuming more memory) to 8-bit integers to optimize memory usage and performance. Converting floating-point numbers to integers significantly improves AI inference speed while maintaining almost the same accuracy. The toolkit can convert and execute models built in various frameworks, including TensorFlow, MXNet, PyTorch, Kaldi, and any framework supported by the Open Neural Network Exchange (ONNX) ecosystem. 
Additionally, users can access pretrained public models, speeding up development on Intel\u00ae processors and optimizing image processing pipelines without the need to search for or train models themselves.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-99d7dde e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"99d7dde\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-d8fc19e elementor-widget elementor-widget-heading\" data-id=\"d8fc19e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Deep Learning Reference Stack<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-40f84a0 elementor-widget elementor-widget-text-editor\" data-id=\"40f84a0\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Intel\u00ae Select Solutions for AI Inference come with the Deep Learning Reference Stack (DLRS), an integrated high-performance open-source software stack optimized for Intel\u00ae Xeon\u00ae Scalable processors and packaged within a convenient Docker container. DLRS is pre-validated and well-configured, containing the necessary libraries and software components, thereby reducing the complexity of integrating multiple software components for AI in production environments. The stack also includes highly-tuned containers for mainstream deep learning frameworks like TensorFlow and PyTorch, as well as the Intel\u00ae Distribution of OpenVINO\u2122 Toolkit. 
This open-source community version ensures that AI developers have easy access to all the features and capabilities of Intel\u00ae platforms.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-c63398b e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"c63398b\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-9364131 elementor-widget elementor-widget-heading\" data-id=\"9364131\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Kubeflow and Seldon Core<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ccdc733 elementor-widget elementor-widget-text-editor\" data-id=\"ccdc733\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>As enterprises and organizations accumulate experience deploying inference models in production environments, industry consensus on a set of best practices, known as &#8220;MLOps,&#8221; similar to &#8220;DevOps&#8221; software development practices, has gradually emerged. To assist teams in applying MLOps, Intel\u00ae Select Solutions for AI Inference utilize Kubeflow. With Kubeflow, teams can smoothly roll out new versions of models with zero downtime. Kubeflow exports trained models to Kubernetes using supported model-serving backends (such as TensorFlow Serving). Model deployment can then utilize canary testing or shadow deployment for parallel verification of new and old versions. 
In case of issues, besides tracing, teams can use model and data versioning to simplify root cause analysis.<\/p><p>To maintain responsive services as demand increases, Intel\u00ae Select Solutions for AI Inference provide load balancing capabilities, automatically distributing inference requests across nodes to available instances that can serve them. Multi-tenancy support allows different models to be served on the same infrastructure, increasing hardware utilization. Lastly, to expedite processing of inference requests between servers running AI inference and endpoints needing AI insights, Intel\u00ae Select Solutions for AI Inference can use Seldon Core to help manage inference pipelines. Kubeflow integrates with Seldon Core to deploy deep learning models on Kubernetes and uses the Kubernetes API to manage containers deployed in inference pipelines.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-6404730 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"6404730\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-a6ebbfe elementor-widget elementor-widget-heading\" data-id=\"a6ebbfe\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Hardware Selection<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-3121e9e elementor-widget elementor-widget-text-editor\" data-id=\"3121e9e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Intel\u00ae Select Solutions for AI Inference combine second-generation Intel\u00ae 
Xeon\u00ae Scalable processors, Intel\u00ae Optane\u2122 Solid State Drives (SSDs), Intel\u00ae 3D NAND SSDs, and Intel\u00ae Ethernet 700 Series, allowing your enterprise to rapidly deploy production-grade AI infrastructure on a performance-optimized platform, providing large memory capacities for demanding applications and workloads.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-b908308 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"b908308\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-e7a0dd2 elementor-widget elementor-widget-heading\" data-id=\"e7a0dd2\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Second-generation Intel\u00ae Xeon\u00ae Scalable Processors<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9f19598 elementor-widget elementor-widget-text-editor\" data-id=\"9f19598\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Intel\u00ae Select Solutions for AI Inference feature the performance and capabilities of second-generation Intel\u00ae Xeon\u00ae Scalable processors. For &#8220;baseline&#8221; configurations, the Intel\u00ae Xeon\u00ae Gold 6248 processor achieves an excellent balance between price, performance, and integrated technologies, enhancing inference performance and efficiency for AI models. 
The &#8220;enhanced&#8221; configuration utilizes the Intel\u00ae Xeon\u00ae Platinum 8268 processor, designed specifically to achieve faster AI inference. Additionally, higher-tier processors are also available in either configuration. The second-generation Intel\u00ae Xeon\u00ae Scalable processors include Intel\u00ae Deep Learning Boost technology, a suite of acceleration features that improve AI inference performance through specialized Vector Neural Network Instructions (VNNI). This instruction set enables deep learning calculations that previously required three separate instructions to be completed with a single instruction.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-f0906c7 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"f0906c7\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-6230178 elementor-widget elementor-widget-heading\" data-id=\"6230178\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Intel\u00ae Optane\u2122 Technology<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1cb2a51 elementor-widget elementor-widget-text-editor\" data-id=\"1cb2a51\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Intel\u00ae Optane\u2122 Technology bridges the critical gap between storage and memory layers, enabling data centers to access data faster. 
This technology disrupts the memory and storage layers, providing persistent memory, large memory pools, high-speed caches, and storage across a variety of different products and solutions.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-3752ee6 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"3752ee6\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-394f422 elementor-widget elementor-widget-heading\" data-id=\"394f422\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Solid State Drives<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-82fc631 elementor-widget elementor-widget-text-editor\" data-id=\"82fc631\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>When the cache layer runs on high-speed solid-state drives (SSDs) with low latency and high durability, AI inference can fully leverage their performance. Deploying high-performance SSDs for the cache layer, rather than mainstream Serial ATA (SATA) SSDs, offers significant benefits for high-performance workloads. In Intel\u00ae Select Solutions, the cache layer utilizes Intel\u00ae Optane\u2122 Solid State Drives, which provide high input\/output operations per second (IOPS) at a competitive cost, with low latency and high durability, along with up to 30 drive writes per day (DWPD), making them an ideal choice for write-intensive caching functions. 
The capacity layer utilizes Intel\u00ae 3D NAND Solid State Drives, which offer outstanding read performance, along with data integrity, performance consistency, and drive reliability.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-8548e19 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"8548e19\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-c1b3f71 elementor-widget elementor-widget-heading\" data-id=\"c1b3f71\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">25 Gb Ethernet<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-da612f3 elementor-widget elementor-widget-text-editor\" data-id=\"da612f3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The 25 Gb Intel\u00ae Ethernet 700 Series network adapters enhance the performance of Intel\u00ae Select Solutions for AI Inference. Compared to using 1 Gb Ethernet (GbE) adapters and Intel\u00ae SSD DC S4500, using 25 Gb Ethernet adapters with second-generation Intel\u00ae Xeon\u00ae Platinum processors and Intel\u00ae SSD DC P4600 can provide up to 2.5 times higher performance. The Intel\u00ae Ethernet 700 Series offers validated performance, with extensive interoperability to meet high-quality thresholds for data resilience and service reliability. 
All Intel\u00ae Ethernet products provide global pre-sales and post-sales support, along with limited warranty coverage throughout the product lifecycle.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-a3e33d0 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"a3e33d0\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-2d2b70e elementor-widget elementor-widget-heading\" data-id=\"2d2b70e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Performance Verified by Benchmark Testing<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-7afd27e elementor-widget elementor-widget-text-editor\" data-id=\"7afd27e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>All Intel\u00ae Select Solutions undergo benchmark testing to meet pre-defined levels of functionality optimized for specific workloads. AI inference is becoming an integral part of various workloads in data centers, network edges, and clouds. Therefore, Intel chooses to utilize standard deep learning benchmarking methods and simulate real-world scenarios for measurement and benchmarking.<\/p><p>In standard benchmark testing, the number of images processed per second (throughput) is measured on a pre-trained deep residual neural network (ResNet 50 v1). 
This neural network is closely associated with widely used deep learning use cases such as image classification, localization, and detection; throughput is measured with TensorFlow, PyTorch, and the OpenVINO\u2122 toolkit using synthetic data.<\/p><p>To simulate real-world scenarios, multiple clients are launched to generate multiple request streams. These clients send images from external client systems to the server for inference. On the server side, inbound requests are load balanced by Istio. The requests are then sent to multiple instances of a servable object that runs a pipeline of preprocessing, prediction, and post-processing steps through Seldon Core. Inference is completed using the OpenVINO\u2122 Model Server from the optimized DLRS container image. After passing through the pipeline, the inference results are returned to the requesting client. The throughput and latency measured during this process help ensure that the tested configuration can support inference at production scale.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-3f3d94a e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"3f3d94a\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-7b499ca elementor-widget elementor-widget-heading\" data-id=\"7b499ca\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Baseline Configuration and Enhanced Configuration<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-bf87c9f elementor-widget elementor-widget-text-editor\" data-id=\"bf87c9f\" data-element_type=\"widget\" 
data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>We present two reference configurations (&#8220;baseline configuration&#8221; and &#8220;enhanced configuration&#8221;) to showcase Intel\u00ae Select Solutions for AI Inference. Both configurations are validated and offer excellent performance. These configurations are specially designed and pre-tested to provide outstanding value, performance, security, and user experience. Ultimately, end customers can collaborate with system builders, system integrators, or solution and service providers to customize these configurations based on the needs and budgets of their enterprises and organizations.<\/p><p>The &#8220;baseline configuration&#8221; offers excellent value for money and is optimized for AI inference workloads. The &#8220;enhanced configuration&#8221; utilizes higher-tier models of Intel\u00ae Xeon\u00ae Scalable processors than the &#8220;baseline configuration&#8221; and doubles the memory. 
Table 1 provides detailed information on these two configurations.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-d89f5c5 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"d89f5c5\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-067fbf2 elementor-widget elementor-widget-heading\" data-id=\"067fbf2\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Table 1. Baseline and Enhanced Configurations of Intel\u00ae Select Solutions for AI Inference, Version 2<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d1163e1 table-style-001 elementor-widget elementor-widget-text-editor\" data-id=\"d1163e1\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<table dir=\"ltr\" style=\"table-layout: fixed; font-size: 10pt; font-family: Arial; width: 0px; border-collapse: collapse; border: none;\" border=\"1\" cellspacing=\"0\" cellpadding=\"0\" data-sheets-root=\"1\"><colgroup> <col width=\"275\" \/> <col width=\"420\" \/> <col width=\"459\" \/><\/colgroup><tbody><tr style=\"height: 21px;\"><td style=\"border: 1px solid #e3e3e3; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffb217; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d; text-align: center;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Configuration Item&quot;}\"><strong>Configuration Item<\/strong><\/td><td 
style=\"border-width: 1px; border-style: solid; border-color: #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffb217; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d; text-align: center;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Basic Configuration&quot;}\"><strong>Basic Configuration<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #ececec #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffb217; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d; text-align: center;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Enhanced Configuration&quot;}\"><strong>Enhanced Configuration<\/strong><\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Processor&quot;}\"><strong>Processor<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2 x Intel\u00ae Xeon\u00ae Gold 6248 Processor, 2.5 GHz, 20 Cores, 40 Threads (or higher configuration)&quot;}\">2 x Intel\u00ae Xeon\u00ae Gold 6248 Processor, 2.5 GHz, 20 Cores, 40 Threads (or higher configuration)<\/td><td style=\"border-width: 1px; border-style: solid; 
border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2 x Intel\u00ae Xeon\u00ae Platinum 8268 Processor, 2.90 GHz, 24 Cores, 48 Threads (or higher configuration)&quot;}\">2 x Intel\u00ae Xeon\u00ae Platinum 8268 Processor, 2.90 GHz, 24 Cores, 48 Threads (or higher configuration)<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Memory&quot;}\"><strong>Memory<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;192 GB or more (12 x 16 GB 2,666 MHz DDR4 ECC RDIMM)&quot;}\">192 GB or more (12 x 16 GB 2,666 MHz DDR4 ECC RDIMM)<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;384 GB (12 x 32 GB 2,934 MHz DDR4 ECC RDIMM)&quot;}\">384 GB (12 x 32 GB 2,934 MHz DDR4 ECC RDIMM)<\/td><\/tr><tr style=\"height: 
21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Storage (Boot Disk)&quot;}\"><strong>Storage (Boot Disk)<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x 256 GB Intel\u00ae SSD DC P4101 (M.2 80 mm PCIe 3.0 x4 NVMe)&quot;}\">1 x 256 GB Intel\u00ae SSD DC P4101 (M.2 80 mm PCIe 3.0 x4 NVMe)<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x 256 GB Intel\u00ae SSD DC P4101 (M.2 80 mm PCIe 3.0 x4 NVMe)&quot;}\">1 x 256 GB Intel\u00ae SSD DC P4101 (M.2 80 mm PCIe 3.0 x4 NVMe)<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Storage (Cache)&quot;}\"><strong>Storage (Cache)<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; 
border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x 375 GB Intel\u00ae Optane\u2122 SSD DC P4800X, featuring Intel\u00ae Memory Drive Technology&quot;}\">1 x 375 GB Intel\u00ae Optane\u2122 SSD DC P4800X, featuring Intel\u00ae Memory Drive Technology<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x 375 GB Intel\u00ae Optane\u2122 SSD DC P4800X, featuring Intel\u00ae Memory Drive Technology&quot;}\">1 x 375 GB Intel\u00ae Optane\u2122 SSD DC P4800X, featuring Intel\u00ae Memory Drive Technology<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Storage (Capacity)&quot;}\"><strong>Storage (Capacity)<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x 2.0 TB Intel\u00ae SSD DC P4510 (2.5-inch 
PCIe NVMe SSD)&quot;}\">1 x 2.0 TB Intel\u00ae SSD DC P4510 (2.5-inch PCIe NVMe SSD)<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x 2.0 TB Intel\u00ae SSD DC P4510 (2.5-inch PCIe NVMe SSD)&quot;}\">1 x 2.0 TB Intel\u00ae SSD DC P4510 (2.5-inch PCIe NVMe SSD)<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Data Network&quot;}\"><strong>Data Network<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x Dual-port 25\/10\/1 GbE Intel\u00ae Ethernet Network Adapter XXV710-DA2 (or higher model)&quot;}\">1 x Dual-port 25\/10\/1 GbE Intel\u00ae Ethernet Network Adapter XXV710-DA2 (or higher model)<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: 
break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1 x Dual-port 25\/10\/1 GbE Intel\u00ae Ethernet CNA XXV710-DA2 SFP28 (or higher model)&quot;}\">1 x Dual-port 25\/10\/1 GbE Intel\u00ae Ethernet CNA XXV710-DA2 SFP28 (or higher model)<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Software&quot;}\"><strong>Software<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;[Not specified]&quot;}\">[Not specified]<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;[Not specified]&quot;}\">[Not specified]<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" 
data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;CentOS&quot;}\"><strong>CentOS<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;7.6.1810&quot;}\">7.6.1810<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;7.6.1810&quot;}\">7.6.1810<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Kernel&quot;}\"><strong>Kernel<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;3.10.0-957.el7.86_64&quot;}\">3.10.0-957.el7.86_64<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; 
font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;3.10.0-957.el7.x86_64&quot;}\">3.10.0-957.el7.x86_64<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Intel\u00ae Distribution of OpenVINO\u2122 Toolkit&quot;}\"><strong>Intel\u00ae Distribution of OpenVINO\u2122 Toolkit<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:3,&quot;3&quot;:2021.2}\">2021.2<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:3,&quot;3&quot;:2021.2}\">2021.2<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;OpenVINO\u2122 Model Server&quot;}\"><strong>OpenVINO\u2122 Model 
Server<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:3,&quot;3&quot;:2019.3}\">2019.3<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:3,&quot;3&quot;:2019.3}\">2019.3<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;TensorFlow&quot;}\"><strong>TensorFlow<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2.4.0&quot;}\">2.4.0<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" 
data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2.4.0&quot;}\">2.4.0<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;PyTorch&quot;}\"><strong>PyTorch<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1.8.0&quot;}\">1.8.0<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1.8.0&quot;}\">1.8.0<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;MXNet&quot;}\"><strong>MXNet<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; 
font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1.3.1&quot;}\">1.3.1<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1.3.1&quot;}\">1.3.1<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Intel\u00ae Distribution for Python&quot;}\"><strong>Intel\u00ae Distribution for Python<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2019 Update 1&quot;}\">2019 Update 1<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2019 Update 1&quot;}\">2019 Update 1<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: 
#cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Intel\u00ae Math Kernel Library for Deep Neural Networks (MKL-DNN)&quot;}\"><strong>Intel\u00ae Math Kernel Library for Deep Neural Networks (MKL-DNN)<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2019.3 (implied by OpenVINO)&quot;}\">2019.3 (implied by OpenVINO)<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;2019.3 (implied by OpenVINO)&quot;}\">2019.3 (implied by OpenVINO)<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Deep Learning Reference Stack (DLRS)&quot;}\"><strong>Deep Learning Reference Stack (DLRS)<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; 
padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;v0.5.1&quot;}\">v0.5.1<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;v0.5.1&quot;}\">v0.5.1<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Source to Image&quot;}\"><strong>Source to Image<\/strong><\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1.2.0&quot;}\">1.2.0<\/td><td style=\"border-width: 1px; border-style: solid; border-color: #cccccc #e3e3e3 #e3e3e3 #cccccc; border-image: initial; overflow: hidden; padding: 2px 3px; vertical-align: bottom; background-color: #ffffff; font-family: Arial; font-weight: normal; white-space: normal; overflow-wrap: break-word; color: #0d0d0d;\" data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1.2.0&quot;}\">1.2.0<\/td><\/tr><tr style=\"height: 21px;\"><td style=\"border-width: 1px; 
border-style: solid;\"><strong>Docker<\/strong><\/td><td>18.09<\/td><td>18.09<\/td><\/tr><tr><td><strong>Kubernetes<\/strong><\/td><td>v1.15.1<\/td><td>v1.15.1<\/td><\/tr><tr><td><strong>Kubeflow<\/strong><\/td><td>1.0.1<\/td><td>1.0.1<\/td><\/tr><tr><td><strong>Helm<\/strong><\/td><td>3.2<\/td><td>3.2<\/td><\/tr><tr><td><strong>Seldon Core<\/strong><\/td><td>1.0.1<\/td><td>1.0.1<\/td><\/tr><tr><td><strong>Ceph<\/strong><\/td><td>v14.2.7<\/td><td>v14.2.7<\/td><\/tr><tr><td><strong>Min.io (Rook v1.0)<\/strong><\/td><td>1.2.7<\/td><td>1.2.7<\/td><\/tr><tr><td><strong>Rook<\/strong><\/td><td>1.2.7<\/td><td 
data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;1.2.7&quot;}\">1.2.7<\/td><\/tr><\/tbody><\/table>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-808dc2d e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"808dc2d\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-482abfe elementor-widget elementor-widget-text-editor\" data-id=\"482abfe\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The technical choices of Intel\u00ae Select Solutions for AI inference include not only a robust Intel\u00ae hardware foundation but also other Intel\u00ae technologies that further enhance performance and reliability:<\/p><ul><li>Intel\u00ae Advanced Vector Extensions 512 (Intel\u00ae AVX-512): A 512-bit instruction set that boosts performance for demanding workloads and use cases such as AI inference.<\/li><li>Intel\u00ae Deep Learning Acceleration: A suite of acceleration features introduced with the second-generation Intel\u00ae Xeon\u00ae Scalable processors, significantly improving performance for inference applications built using advanced deep learning frameworks like PyTorch, TensorFlow, MXNet, PaddlePaddle, and Caffe. 
The foundation of Intel\u00ae Deep Learning Acceleration technology is VNNI (Vector Neural Network Instructions), a specialized instruction set that performs a deep learning multiply-accumulate operation with a single instruction, replacing the three separate instructions previously required.<\/li><li>Intel\u00ae Distribution of OpenVINO\u2122 Toolkit: A free software suite that helps developers and data scientists accelerate AI workloads and simplify deep learning inference and deployment from the network edge to the cloud.<\/li><li>Intel\u00ae Math Kernel Library (Intel\u00ae MKL): A library of mainstream mathematical operations optimized for Intel\u00ae hardware, enabling applications to take full advantage of the Intel\u00ae AVX-512 instruction set. It is compatible with a wide range of compilers, languages, operating systems, and linking and threading models.<\/li><li>Intel\u00ae Math Kernel Library for Deep Neural Networks (Intel\u00ae MKL-DNN): An open-source performance library that accelerates deep learning frameworks on Intel\u00ae hardware.<\/li><li>Intel\u00ae Distribution for Python: Accelerates AI-related Python libraries (such as NumPy, SciPy, and scikit-learn) with integrated Intel\u00ae performance libraries (such as Intel\u00ae MKL), improving AI inference speed.<\/li><li>Framework optimization: Intel collaborates with Google, Apache, and Baidu on the TensorFlow, MXNet, and PaddlePaddle platforms, respectively, and actively develops technologies for Caffe and PyTorch. 
Software optimizations tailored for Intel\u00ae Xeon\u00ae Scalable processors within data centers are employed to enhance deep learning performance, with ongoing efforts to incorporate frameworks from other industry leaders.<\/li><\/ul>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-a1e3492 e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"a1e3492\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-d55a0c1 elementor-widget elementor-widget-heading\" data-id=\"d55a0c1\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Intel\u00ae Xeon\u00ae Scalable Processor<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-eeedad3 elementor-widget elementor-widget-text-editor\" data-id=\"eeedad3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The second-generation Intel\u00ae Xeon\u00ae Scalable Processor:<\/p><ul><li>Provides scalability in an economical, efficient, and flexible manner, spanning from multi-cloud environments to intelligent edge.<\/li><li>Establishes a seamless performance foundation, helping accelerate the transformative impact of data.<\/li><li>Supports groundbreaking Intel\u00ae Optane\u2122 Persistent Memory technology.<\/li><li>Enhances AI performance and helps the entire data center become AI-ready.<\/li><li>Offers hardware-enhanced platform protection and threat 
monitoring.<\/li><\/ul>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-7311a3a e-flex e-con-boxed elementor-invisible e-con e-parent\" data-id=\"7311a3a\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;animation&quot;:&quot;fadeInLeft&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-1d96464 elementor-widget elementor-widget-heading\" data-id=\"1d96464\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Deploy optimized, high-speed AI inference on industry-standard hardware<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d3b5577 elementor-widget elementor-widget-text-editor\" data-id=\"d3b5577\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The workload-optimized configurations provided by Intel\u00ae Select Solutions are validated for Intel\u00ae Xeon\u00ae Scalable Processors and serve as a shortcut to data center transformation. By selecting Intel\u00ae Select Solutions for AI inference, enterprises and organizations gain access to pre-tuned configurations that have been tested and proven in real-world deployments and that support optimization at scale. This allows IT departments to deploy AI inference quickly and efficiently in production environments. 
Additionally, selecting Intel\u00ae Select Solutions for AI inference enables IT departments to achieve high-speed AI inference on hardware they are familiar with and accustomed to deploying and managing.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>More and more businesses are hoping to leverage Artificial Intelligence (AI) to increase revenue, improve efficiency, and drive product innovation. It is particularly noteworthy that AI use cases based on Deep Learning (DL) technology can bring effective and practical insights; some of these use cases can advance progress in various industries, such as: Image Classification [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":801,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[13],"tags":[],"class_list":["post-979","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-solutions"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/posts\/979","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/comments?post=979"}],"version-history":[{"count":4,"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/posts\/979\/revisions"}],"predecessor-version":[{"id":983,"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/posts\/979\/revisions\/983"}],"wp:featuredmedia":[{"embeddable":true,"hr
ef":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/media\/801"}],"wp:attachment":[{"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/media?parent=979"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/categories?post=979"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/shyuxinpc.com\/cn\/wp-json\/wp\/v2\/tags?post=979"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}