# Python: init cleanup (#5872)
### Motivation and Context


When the root init has to load a lot of modules, importing the package becomes slow. The root now contains only one concept, `Kernel`; everything else must be loaded from a subpackage.

The following is now the guidance:
- imports within SK use the full path
- init files are created as much as possible at the root+1 level (like `semantic_kernel.functions`), containing the pieces that most developers need; the exceptions are connectors and utils, where a developer needs to go one level deeper
- within the connectors folder this is split further, first into ai, memory, and search_engine; within ai it is split per connector, so for instance everything for OpenAI and Azure OpenAI can be imported using `from semantic_kernel.connectors.ai.open_ai import ...`; the same applies within memory
- imports in samples use the abbreviated path as much as possible.
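The "light root init" idea behind this guidance can be sketched in plain Python. The toy package below is built in memory and is not the actual semantic_kernel layout: the root exposes only `Kernel` eagerly and resolves everything else from a subpackage on first access, via the module-level `__getattr__` hook from PEP 562.

```python
# Toy illustration of a light root init (PEP 562) -- an in-memory
# package, NOT the real semantic_kernel source layout.
import importlib
import sys
import types

# Fake subpackage that would be expensive to import eagerly.
functions_mod = types.ModuleType("toy_sk.functions")
functions_mod.KernelArguments = dict  # stand-in for the real class

# Fake root package: only Kernel lives here.
root = types.ModuleType("toy_sk")


class Kernel:
    """The one concept exposed eagerly at the package root."""


root.Kernel = Kernel


def _root_getattr(name):
    # PEP 562: called only when normal attribute lookup fails, so the
    # subpackage is imported on first access rather than at load time.
    if name == "KernelArguments":
        return importlib.import_module("toy_sk.functions").KernelArguments
    raise AttributeError(name)


root.__getattr__ = _root_getattr
sys.modules["toy_sk"] = root
sys.modules["toy_sk.functions"] = functions_mod

import toy_sk  # cheap: only Kernel was loaded eagerly

kernel_cls = toy_sk.Kernel         # direct attribute on the root
args_cls = toy_sk.KernelArguments  # triggers the lazy subpackage lookup
print(kernel_cls.__name__, args_cls.__name__)  # → Kernel dict
```

In practice this is why notebook code in this PR now writes `from semantic_kernel import Kernel` plus deeper imports like `from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion`, instead of `import semantic_kernel as sk`.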

### Description


### Contribution Checklist


- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄
eavanvalkenburg committed Apr 16, 2024
1 parent beef63c commit 66a3b5b
Showing 62 changed files with 2,364 additions and 2,462 deletions.
22 changes: 13 additions & 9 deletions python/notebooks/00-getting-started.ipynb
@@ -7,7 +7,7 @@
"source": [
"# Setup\n",
"\n",
"**Step 1**: Import Semantic Kernel SDK from pypi.org"
"**Step 1**: Import Semantic Kernel SDK from pypi.org\n"
]
},
{
@@ -25,16 +25,16 @@
"metadata": {},
"outputs": [],
"source": [
"import semantic_kernel as sk\n",
"from semantic_kernel import Kernel\n",
"\n",
"kernel = sk.Kernel()"
"kernel = Kernel()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Configure the service you'd like to use via the `Service` Enum."
"### Configure the service you'd like to use via the `Service` Enum.\n"
]
},
{
@@ -75,7 +75,7 @@
"AZURE_OPENAI_DEPLOYMENT_NAME=\"...\"\n",
"```\n",
"\n",
"Use \"keyword arguments\" to instantiate an Azure OpenAI Chat Completion service and add it to the kernel:"
"Use \"keyword arguments\" to instantiate an Azure OpenAI Chat Completion service and add it to the kernel:\n"
]
},
{
@@ -84,19 +84,21 @@
"metadata": {},
"outputs": [],
"source": [
"from semantic_kernel.utils.settings import azure_openai_settings_from_dot_env, openai_settings_from_dot_env\n",
"\n",
"service_id = None\n",
"if selectedService == Service.OpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion\n",
"\n",
" api_key, org_id = sk.openai_settings_from_dot_env()\n",
" api_key, org_id = openai_settings_from_dot_env()\n",
" service_id = \"default\"\n",
" kernel.add_service(\n",
" OpenAIChatCompletion(service_id=service_id, ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id),\n",
" )\n",
"elif selectedService == Service.AzureOpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion\n",
"\n",
" deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()\n",
" deployment, api_key, endpoint = azure_openai_settings_from_dot_env()\n",
" service_id = \"default\"\n",
" kernel.add_service(\n",
" AzureChatCompletion(service_id=service_id, deployment_name=deployment, endpoint=endpoint, api_key=api_key),\n",
@@ -110,7 +112,7 @@
"source": [
"# Run a Semantic Function\n",
"\n",
"**Step 3**: Load a Plugin and run a semantic function:"
"**Step 3**: Load a Plugin and run a semantic function:\n"
]
},
{
@@ -128,9 +130,11 @@
"metadata": {},
"outputs": [],
"source": [
"from semantic_kernel.functions import KernelArguments\n",
"\n",
"joke_function = plugin[\"Joke\"]\n",
"\n",
"joke = await kernel.invoke(joke_function, sk.KernelArguments(input=\"time travel to dinosaur age\", style=\"super silly\"))\n",
"joke = await kernel.invoke(joke_function, KernelArguments(input=\"time travel to dinosaur age\", style=\"super silly\"))\n",
"print(joke)"
]
}
22 changes: 12 additions & 10 deletions python/notebooks/01-basic-loading-the-kernel.ipynb
@@ -5,7 +5,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Basic Loading of the Kernel"
"# Basic Loading of the Kernel\n"
]
},
{
@@ -14,9 +14,9 @@
"metadata": {},
"source": [
"To run the notebooks we recommend using Poetry and starting a shell with a virtual environment\n",
"prepared to use SK. \n",
"prepared to use SK.\n",
"\n",
"See [DEV_SETUP.md](../../python/DEV_SETUP.md) for more information."
"See [DEV_SETUP.md](../../python/DEV_SETUP.md) for more information.\n"
]
},
{
@@ -34,7 +34,9 @@
"metadata": {},
"outputs": [],
"source": [
"import semantic_kernel as sk"
"from semantic_kernel import Kernel\n",
"\n",
"kernel = Kernel()"
]
},
{
@@ -46,7 +48,7 @@
"\n",
"The SDK currently supports OpenAI and Azure OpenAI, among other connectors.\n",
"\n",
"If you need an Azure OpenAI key, go [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/quickstart?pivots=rest-api)."
"If you need an Azure OpenAI key, go [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/quickstart?pivots=rest-api).\n"
]
},
{
@@ -67,21 +69,21 @@
"metadata": {},
"outputs": [],
"source": [
"kernel = sk.Kernel()\n",
"\n",
"service_id = None\n",
"if selectedService == Service.OpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion\n",
" from semantic_kernel.utils.settings import openai_settings_from_dot_env\n",
"\n",
" api_key, org_id = sk.openai_settings_from_dot_env()\n",
" api_key, org_id = openai_settings_from_dot_env()\n",
" service_id = \"oai_chat_gpt\"\n",
" kernel.add_service(\n",
" OpenAIChatCompletion(service_id=service_id, ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id),\n",
" )\n",
"elif selectedService == Service.AzureOpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion\n",
" from semantic_kernel.utils.settings import azure_openai_settings_from_dot_env\n",
"\n",
" deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()\n",
" deployment, api_key, endpoint = azure_openai_settings_from_dot_env()\n",
" service_id = \"aoai_chat_completion\"\n",
" kernel.add_service(\n",
" AzureChatCompletion(service_id=service_id, deployment_name=deployment, endpoint=endpoint, api_key=api_key),\n",
@@ -93,7 +95,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Great, now that you're familiar with setting up the Semantic Kernel, let's see [how we can use it to run prompts](02-running-prompts-from-file.ipynb)."
"Great, now that you're familiar with setting up the Semantic Kernel, let's see [how we can use it to run prompts](02-running-prompts-from-file.ipynb).\n"
]
}
],
38 changes: 20 additions & 18 deletions python/notebooks/02-running-prompts-from-file.ipynb
@@ -7,15 +7,16 @@
"metadata": {},
"source": [
"# How to run a prompt plugins from file\n",
"Now that you're familiar with Kernel basics, let's see how the kernel allows you to run Prompt Plugins and Prompt Functions stored on disk. \n",
"\n",
"A Prompt Plugin is a collection of Semantic Functions, where each function is defined with natural language that can be provided with a text file. \n",
"Now that you're familiar with Kernel basics, let's see how the kernel allows you to run Prompt Plugins and Prompt Functions stored on disk.\n",
"\n",
"A Prompt Plugin is a collection of Semantic Functions, where each function is defined with natural language that can be provided with a text file.\n",
"\n",
"Refer to our [glossary](https://github.com/microsoft/semantic-kernel/blob/main/docs/GLOSSARY.md) for an in-depth guide to the terms.\n",
"\n",
"The repository includes some examples under the [samples](https://github.com/microsoft/semantic-kernel/tree/main/samples) folder.\n",
"\n",
"For instance, [this](../../plugins/FunPlugin/Joke/skprompt.txt) is the **Joke function** part of the **FunPlugin plugin**:"
"For instance, [this](../../plugins/FunPlugin/Joke/skprompt.txt) is the **Joke function** part of the **FunPlugin plugin**:\n"
]
},
{
@@ -34,7 +35,7 @@
"+++++\n",
"{{$input}}\n",
"+++++\n",
"```"
"```\n"
]
},
{
@@ -43,9 +44,9 @@
"id": "afdb96d6",
"metadata": {},
"source": [
"Note the special **`{{$input}}`** token, which is a variable that is automatically passed when invoking the function, commonly referred to as a \"function parameter\". \n",
"Note the special **`{{$input}}`** token, which is a variable that is automatically passed when invoking the function, commonly referred to as a \"function parameter\".\n",
"\n",
"We'll explore later how functions can accept multiple variables, as well as invoke other functions."
"We'll explore later how functions can accept multiple variables, as well as invoke other functions.\n"
]
},
{
@@ -54,7 +55,6 @@
"id": "c3bd5134",
"metadata": {},
"source": [
"\n",
"In the same folder you'll notice a second [config.json](../../plugins/FunPlugin/Joke/config.json) file. The file is optional, and is used to set some parameters for large language models like Temperature, TopP, Stop Sequences, etc.\n",
"\n",
"```\n",
@@ -84,7 +84,7 @@
" ]\n",
"}\n",
"\n",
"```"
"```\n"
]
},
{
@@ -95,7 +95,7 @@
"source": [
"Given a prompt function defined by these files, this is how to load and use a file based prompt function.\n",
"\n",
"Load and configure the kernel, as usual, loading also the AI service settings defined in the [Setup notebook](00-getting-started.ipynb):"
"Load and configure the kernel, as usual, loading also the AI service settings defined in the [Setup notebook](00-getting-started.ipynb):\n"
]
},
{
@@ -128,23 +128,25 @@
"metadata": {},
"outputs": [],
"source": [
"import semantic_kernel as sk\n",
"from semantic_kernel import Kernel\n",
"\n",
"kernel = sk.Kernel()\n",
"kernel = Kernel()\n",
"\n",
"service_id = None\n",
"if selectedService == Service.OpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion\n",
" from semantic_kernel.utils.settings import openai_settings_from_dot_env\n",
"\n",
" api_key, org_id = sk.openai_settings_from_dot_env()\n",
" api_key, org_id = openai_settings_from_dot_env()\n",
" service_id = \"default\"\n",
" kernel.add_service(\n",
" OpenAIChatCompletion(service_id=service_id, ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id),\n",
" )\n",
"elif selectedService == Service.AzureOpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion\n",
" from semantic_kernel.utils.settings import azure_openai_settings_from_dot_env\n",
"\n",
" deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()\n",
" deployment, api_key, endpoint = azure_openai_settings_from_dot_env()\n",
" service_id = \"default\"\n",
" kernel.add_service(\n",
" AzureChatCompletion(service_id=service_id, deployment_name=deployment, endpoint=endpoint, api_key=api_key),\n",
@@ -157,7 +159,7 @@
"id": "fd5ff1f4",
"metadata": {},
"source": [
"Import the plugin and all its functions:"
"Import the plugin and all its functions:\n"
]
},
{
Expand All @@ -170,7 +172,7 @@
"# note: using plugins from the samples folder\n",
"plugins_directory = \"../../samples/plugins\"\n",
"\n",
"funFunctions = kernel.import_plugin_from_prompt_directory(plugins_directory, \"FunPlugin\")\n",
"funFunctions = kernel.add_plugin(parent_directory=plugins_directory, plugin_name=\"FunPlugin\")\n",
"\n",
"jokeFunction = funFunctions[\"Joke\"]"
]
@@ -181,7 +183,7 @@
"id": "edd99fa0",
"metadata": {},
"source": [
"How to use the plugin functions, e.g. generate a joke about \"*time travel to dinosaur age*\":"
"How to use the plugin functions, e.g. generate a joke about \"_time travel to dinosaur age_\":\n"
]
},
{
Expand All @@ -191,7 +193,7 @@
"metadata": {},
"outputs": [],
"source": [
"result = await kernel.invoke(jokeFunction, sk.KernelArguments(input=\"travel to dinosaur age\", style=\"silly\"))\n",
"result = await kernel.invoke(jokeFunction, input=\"travel to dinosaur age\", style=\"silly\")\n",
"print(result)"
]
},
@@ -201,7 +203,7 @@
"id": "2281a1fc",
"metadata": {},
"source": [
"Great, now that you know how to load a plugin from disk, let's show how you can [create and run a prompt function inline.](./03-prompt-function-inline.ipynb)"
"Great, now that you know how to load a plugin from disk, let's show how you can [create and run a prompt function inline.](./03-prompt-function-inline.ipynb)\n"
]
}
],
