Microsoft Windows Bulletin Board

Windows Server

Everything posted by Windows Server

  1. I changed my folder color by right-clicking and choosing 'Show more options'; I think the option was above 'Create shortcut'. This was without third-party software, when my laptop still had factory settings. Now I can't change the color anymore. View the full article
  2. I had this issue with Windows 11 where, for some reason, copying a screenshot to the clipboard took a while, like a two or three second delay between the screenshot being taken and it being copied. I was looking for a solution, and so far nobody had a real one, but I think I found a good way to fix this without disabling Snipping Tool. View the full article
  3. I heard it is totally possible to bypass the Windows 11 system requirements and install Windows 11 on an unsupported PC. But it seems most of the tricks do not work for Windows 11 24H2. When I tried to upgrade my PC from Windows 10 Pro to Windows 11 Pro, it said: "This PC doesn't currently meet Windows 11 system requirements." The hardware specs of my PC: Intel Core i7-4770, 8 GB Kingston DDR3 RAM, 256 GB SanDisk Extreme Pro, UEFI BIOS, Secure Boot. As far as I know, the CPU is unsupported by Windows 11 and the PC does not have a TPM 2.0 chip. I am looking for a simple way to bypass the Windows 11 requirements so I can install and run Windows 11 on this unsupported hardware. P.S. I already downloaded the latest Windows 11 24H2 ISO from Microsoft on my computer. Thanks View the full article
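     For reference, the most widely circulated bypass (not supported by Microsoft, and not guaranteed to survive every 24H2 build) is to boot the PC from the 24H2 ISO media, press Shift+F10 at the first setup screen to open a Command Prompt, and create the LabConfig values before continuing with setup:

     ```cmd
     reg add "HKLM\SYSTEM\Setup\LabConfig" /v BypassTPMCheck /t REG_DWORD /d 1 /f
     reg add "HKLM\SYSTEM\Setup\LabConfig" /v BypassSecureBootCheck /t REG_DWORD /d 1 /f
     reg add "HKLM\SYSTEM\Setup\LabConfig" /v BypassRAMCheck /t REG_DWORD /d 1 /f
     ```

     Note that these values are only read by Windows Setup when it runs from boot media; they do not affect an in-place upgrade started from within Windows (see the next two posts).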
  4. I want to put my PC in sleep mode but it keeps turning back on almost instantly. I've been having this issue for several weeks, but before that it never did this. I opened Command Prompt and ran "powercfg /lastwake", and it's showing the text below as the reason it's turning back on. Anyone know what this means and how to resolve the issue? View the full article
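     For anyone troubleshooting the same thing, the built-in powercfg queries below are the usual way to narrow down spurious wakes (run from an elevated Command Prompt; the device name in the last command is a hypothetical example and varies per machine):

     ```cmd
     powercfg /lastwake
     powercfg /devicequery wake_armed
     powercfg /waketimers
     :: Disable wake permission for a specific device (example name; list yours with the query above)
     powercfg /devicedisablewake "HID-compliant mouse"
     ```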
  5. I installed Windows 11 on a non-TPM computer ages ago, using Rufus. But now I'm told my version 21H2 has reached end of service, and when I try to update using the Windows 11 Installation Assistant, it tells me TPM is needed. I already changed the registry and rebooted, as I found on this site: Navigate to HKEY_LOCAL_MACHINE\SYSTEM\Setup. Create a new registry key under Setup and name it LabConfig. Within LabConfig, create DWORD values BypassTPMCheck and BypassSecureBootCheck and set each to 1. But it still detects the absence of TPM. How do I get past this (without the hassle of ISOs and booting off other media)? It must be possible. View the full article
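     A hedged pointer rather than a confirmed fix: the LabConfig values above are only read by Windows Setup when it runs from boot media, which is why the Installation Assistant ignores them. For upgrades started from within Windows, the value usually cited is the MoSetup key that Microsoft itself documented, though per that documentation it still expects at least TPM 1.2, so a machine with no TPM at all may still need modified install media (e.g. created with Rufus):

     ```cmd
     reg add "HKLM\SYSTEM\Setup\MoSetup" /v AllowUpgradesWithUnsupportedTPMOrCPU /t REG_DWORD /d 1 /f
     ```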
  6. I downloaded the little Windows tool, and it told me my PC is not compatible with Windows 11, but it's basically because of the lack of the TPM 2.0 module. Specs: CPU: Intel Core i7 8700k Mobo: Gigabyte Z370 Aorus Gaming Ram: 32gb DDR4 3200 I'm thinking about upgrading my whole system (Ryzen 7 9700x CPU, with compatible mobo and RAM), instead of bypassing the W11 TPM 2.0 module requirement, all because I've heard that down the line there may be some "incompatibility" or security issues that Windows may require TPM 2.0 for. And I've also heard that buying a TPM 2.0 module for my mobo just to make it compatible with W11 is not really the best idea either. View the full article
  7. We are excited to announce the latest update to GitHub Copilot for Azure, featuring a brand-new update experience for your infrastructure needs. This update makes it easier than ever to generate customized Infrastructure as Code (IaC) files for your application. What's New? With this new feature, you can update project information, host and target services, binding information, and environment variables in a more intuitive UI before asking GitHub Copilot for Azure to generate IaC files. How to Use the New Feature Make sure you have the latest versions of the GitHub Copilot, GitHub Copilot Chat, and GitHub Copilot for Azure VS Code extensions installed. Open GitHub Copilot Chat and ask it to recommend Azure services for your app. Click the "Update" button under "You can make more detailed changes to the recommendations by clicking the 'Update' button." A new tab will show up. Previously the update process required you to interact with Copilot Chat; now you can update everything within this easy-to-use update tab. In the update panel, you can update the project information. You can update the hosting service; currently, only Azure Container Apps and Azure App Service are supported. You can update the target service and bindings. You can update the environment variables. Once you are satisfied with the changes, you can save and generate IaC files. Try It Out Today! We invite you to explore this new experience for yourself. Your feedback is invaluable to us, so please don't hesitate to share your thoughts and suggestions. Try out the new release today and see how GitHub Copilot for Azure can take your coding to the next level! View the full article
  8. Hi all, My old Windows 10 PC is retiring and I bought a new Windows 11 PC. I heard it could be a good choice to run Linux on an old, budget machine. Before doing that, I need to create a Linux bootable USB first. Does anyone know how to burn a Linux ISO to USB on a Windows 11/10 PC? I tried the official Windows Media Creation Tool, but the USB is not recognized as a bootable device. Am I using the wrong tool? Thanks View the full article
  9. There is a lot of material around to get started and expand with Microsoft Copilot Studio: trainings, articles, and more. Below I have assembled a selection of resources which I find very helpful for getting into Microsoft Copilot Studio. The workshops especially guide you through end-to-end scenarios for creating agents, which you can then use and extend based on your needs.
     Copilot Developer Camp: a workshop for makers and professional developers who want to learn how to build agents for Microsoft 365 Copilot. https://microsoft.github.io/copilot-camp/
     Microsoft Copilot Studio Samples: this repository contains samples and artifacts for Microsoft Copilot Studio. https://github.com/microsoft/CopilotStudioSamples/tree/main
     Designing your own copilot using Copilot Studio (L300) lab: this lab teaches you how to design your own copilot with Copilot Studio. https://microsoft.github.io/TechExcel-Designing-your-own-copilot-using-copilot-studio/
     Microsoft 365 Copilot Samples: this repository contains samples that show how to write agents and plugins for Microsoft 365 Copilot. https://github.com/OfficeDev/Microsoft-365-Copilot-Samples
     Copilot learning hub: find what you, a technical professional, need to enhance your productivity, creativity, and data accessibility. https://learn.microsoft.com/en-us/copilot/
     Build your first Agent: build your first agent with Azure AI Agent Service. https://microsoft.github.io/build-your-first-agent-with-azure-ai-agent-service-workshop/
     Generative AI for Beginners (Version 3) - A Course: learn the fundamentals of building Generative AI applications with a comprehensive 21-lesson course by Microsoft Cloud Advocates. https://github.com/microsoft/generative-ai-for-beginners
     View the full article
  10. TOC: Introduction, Setup, References.
     1. Introduction
     Many enterprises prefer not to use App Keys to invoke Function App triggers, as they are concerned that these fixed strings might be exposed. This method allows you to invoke Function App triggers using Managed Identity for enhanced security. I will provide examples in both Bash and Node.js.
     2. Setup
     1. Create a Linux Python 3.11 Function App
     1.1. Configure Authentication to block unauthenticated callers while allowing the Web App's Managed Identity to authenticate:
     Identity Provider: Microsoft
     Choose a tenant for your application and its users: Workforce
     Configuration:
     App registration type: Create
     Name: [automatically generated]
     Client Secret expiration: [fit your business purpose]
     Supported Account Type: Any Microsoft Entra Directory - Multi-Tenant
     Client application requirement: Allow requests from any application
     Identity requirement: Allow requests from any identity
     Tenant requirement: Use default restrictions based on issuer
     Token store: [checked]
     1.2. Create an anonymous trigger. Since your app is already protected by the App Registration, additional Function App-level protection is unnecessary; otherwise, you would need a Function Key to trigger it.
     1.3. Once the Function App is configured, try accessing the endpoint directly; you should receive a 401 Unauthorized error, confirming that triggers cannot be accessed without proper Managed Identity authorization.
     1.4. After making these changes, wait 10 minutes for the settings to take effect.
     2. Create a Linux Node.js 20 Web App, Obtain an Access Token, and Invoke the Function App Trigger Using the Web App (Bash Example)
     2.1. Enable System Assigned Managed Identity in the Web App settings.
     2.2. Open the Kudu SSH console for the Web App.
     2.3. Run the following commands, making the necessary modifications: subscriptionsID: replace with your Subscription ID. resourceGroupsID: replace with your Resource Group ID. application_id_uri: replace with the Application ID URI from your Function App's App Registration. https://az-9640-faapp.azurewebsites.net/api/test_trigger: replace with the corresponding Function App trigger URL.

     # Please set up the target resource to yours
     subscriptionsID="01d39075-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
     resourceGroupsID="XXXX"
     # Variable setting (no need to change)
     identityEndpoint="$IDENTITY_ENDPOINT"
     identityHeader="$IDENTITY_HEADER"
     application_id_uri="api://9c0012ad-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
     # Install necessary tool
     apt install -y jq
     # Get access token
     tokenUri="${identityEndpoint}?resource=${application_id_uri}&api-version=2019-08-01"
     accessToken=$(curl -s -H "Metadata: true" -H "X-IDENTITY-HEADER: $identityHeader" "$tokenUri" | jq -r '.access_token')
     echo "Access Token: $accessToken"
     # Run trigger
     response=$(curl -s -o response.json -w "%{http_code}" -X GET "https://az-9640-myfa.azurewebsites.net/api/my_test_trigger" -H "Authorization: Bearer $accessToken")
     echo "HTTP Status Code: $response"
     echo "Response Body:"
     cat response.json

     2.4. If everything is set up correctly, you should see a successful invocation result.
     3.
Invoke the Function App Trigger Using the Web App (Node.js Example)
     I have also provided a Node.js example, which you can modify accordingly, save to /home/site/wwwroot/callFunctionApp.js, and run:

     cd /home/site/wwwroot/
     vi callFunctionApp.js
     npm init -y
     npm install @azure/identity axios
     node callFunctionApp.js

     // callFunctionApp.js
     const { DefaultAzureCredential } = require("@azure/identity");
     const axios = require("axios");

     async function callFunctionApp() {
       try {
         const applicationIdUri = "api://9c0012ad-XXXX-XXXX-XXXX-XXXXXXXXXXXX"; // Change here
         const credential = new DefaultAzureCredential();
         console.log("Requesting token...");
         const tokenResponse = await credential.getToken(applicationIdUri);
         if (!tokenResponse || !tokenResponse.token) {
           throw new Error("Failed to acquire access token");
         }
         const accessToken = tokenResponse.token;
         console.log("Token acquired:", accessToken);
         const apiUrl = "https://az-9640-myfa.azurewebsites.net/api/my_test_trigger"; // Change here
         console.log("Calling the API now...");
         const response = await axios.get(apiUrl, {
           headers: { Authorization: `Bearer ${accessToken}` },
         });
         console.log("HTTP Status Code:", response.status);
         console.log("Response Body:", response.data);
       } catch (error) {
         console.error("Failed to call the function", error.response ? error.response.data : error.message);
       }
     }

     callFunctionApp();

     Below is my execution result.
     3. References
     Tutorial: Managed Identity to Invoke Azure Functions | Microsoft Learn
     How to Invoke Azure Function App with Managed Identity | by Krizzia 🤖 | Medium
     Configure Microsoft Entra authentication - Azure App Service | Microsoft Learn
     View the full article
  11. Hi there, I am implementing Sensitivity Labels for an organization. They have a couple of SharePoint Classic sites that are required to be labelled. These sites have the old UI and as a result have no Site Information option to choose a label. Does anyone know if we can apply a label to classic sites from Admin Center or using PowerShell? View the full article
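     A hedged sketch rather than a confirmed fix for classic sites: the SharePoint Online Management Shell exposes a -SensitivityLabel parameter on Set-SPOSite, which applies a label by its GUID without going through the site's Site Information panel, so it is worth testing against a classic site (the URLs and GUID below are placeholders):

     ```powershell
     # Connect to the tenant admin endpoint (placeholder URL)
     Connect-SPOService -Url https://contoso-admin.sharepoint.com
     # Apply a sensitivity label by its GUID (find label GUIDs in the Microsoft Purview compliance portal)
     Set-SPOSite -Identity https://contoso.sharepoint.com/sites/ClassicSite -SensitivityLabel "00000000-0000-0000-0000-000000000000"
     ```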
  12. Motive Consulting is excited to launch our new training course Power BI Beginner for Not-for-profits! Over the past few years, we have had the opportunity to create Power BI reports for a variety of not-for-profit organisations. During this time, we noticed a gap in training options available to NFPs, with most existing courses focusing on corporate themes such as sales and profit. Our new course bridges this gap by providing content designed specifically for NFPs. It draws on examples and datasets related to the Australian social service sector, ensuring the training is relevant and practical. As with all of our consulting and training services, we are offering discounted pricing for NFP organisations. Course highlights: 💡 No Power BI experience required 📑 Course manual and exercise files provided 🎓 Led by an experienced Microsoft Certified Trainer 💻 Delivered remotely via Microsoft Teams 🕒 Full day course from 9am to 4pm AEST Course dates: 📆 8 April 📆 21 May 📆 24 June ▶️ For more information, or to book now, visit: https://www.motiveconsulting.com.au/training-services/power-bi-beginner-not-for-profits You can also follow us on LinkedIn at https://www.linkedin.com/company/motive-consulting View the full article
  13. You may need to access storage files for your site, whether it is a Logic App Standard, Function App, or App Service. Depending on your ASP SKU, these files can be accessed using FTP/FTPS. Some customers encounter difficulties when attempting to connect using Implicit/Explicit FTPS. This post aims to simplify this process by utilizing a Logic App to list files, retrieve file content, and update files. An Explicit FTPS connection will be used in this scenario, as it is the one mutually supported by the FTP Connector and by the FTP site when using FTPS. Steps: Create user/password credentials to access the FTP site. You can do this from the Portal or using the CLI. You can run a command shell from the reference below to execute the command. Reference: Configure deployment credentials - Azure App Service | Microsoft Learn CLI: az webapp deployment user set --user-name <username> --password <password> Portal: Enable FTP Basic Authentication on the Destination (Logic App Standard, Function App, App Services): It is highly advised to use an "FTPS only" connection, as it provides a secure encrypted connection. To disable unencrypted FTP, select FTPS Only in FTP state. To disable both FTP and FTPS entirely, select Disabled. When finished, select Save. If using FTPS Only, you must enforce TLS 1.2 or higher by navigating to the TLS/SSL settings page of your web app. TLS 1.0 and 1.1 aren't supported with FTPS Only. (A CLI sketch for enforcing these settings appears at the end of this post.) Reference: Deploy content using FTP/S - Azure App Service | Microsoft Learn The FTP Connector supports Explicit connections only: FTP - Connectors | Microsoft Learn For secure FTP, make sure to set up explicit File Transfer Protocol Secure (FTPS), rather than implicit FTPS. Also, some FTP servers, such as ProFTPd, require that you enable the NoSessionReuseRequired option if you use Transport Layer Security (TLS) mode, the successor to Secure Socket Layer (SSL). The FTP connector doesn't work with implicit FTPS and supports only explicit FTP over FTPS, which is an extension of TLS. Create Logic App and FTP Connection: Create the Logic App workflow, add an FTP Action to list the files, or any FTP Action based on your requirements. To test the connection for the first time, I recommend using the "List files in folder" Action. In the connection configuration: Server Address: xxxxxxx.ftp.azurewebsites.windows.net (Get this value from the Properties of the Destination service; don't add the "ftps://" section nor the "/site/wwwroot" section) User Name and Password: xxxxxx\xxxxx (This is what we created in the FTP credentials tab under the Deployment Center, in the User scope section, or using the CLI command) FTP Server Port: 21 (use Port 21 to force the connection to be Explicit) Enabled SSL?: checked (use SSL to force the connection to use FTPS) Create Logic App FTP Connection: After creating the connection, use "/site/wwwroot" to access your folder. Test this and see if it works! Troubleshooting: Reference: Deploy content using FTP/S - Azure App Service | Microsoft Learn I recommend securing the connection password using Key Vault. More on that below. Secure Parameters in Key Vault: Main steps: Put the connection string in Key Vault. Give the Logic App access to Key Vault. Add the reference in App Settings for the Logic App. The steps are described here: Use Key Vault references - Azure App Service | Microsoft Learn Example of this at the end of this article: A walkthrough of parameterization of different connection types in Logic App Standard | Microsoft Community Hub And that's how you access those files!
You can make use of this secure connection for multiple tasks based on your requirements. View the full article
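     As a side note, the FTP state and TLS settings described above can also be configured from the CLI instead of the Portal; a small sketch using documented az webapp flags (the app and resource group names are placeholders):

     ```bash
     # Enforce FTPS-only deployments (placeholder names)
     az webapp config set --name my-logicapp-std --resource-group my-rg --ftps-state FtpsOnly
     # Enforce TLS 1.2 or higher, required when using FTPS Only
     az webapp config set --name my-logicapp-std --resource-group my-rg --min-tls-version 1.2
     ```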
  14. Your database should be part of a holistic development process, where iterative development tools are coupled with automation for validation and deployment. As previously announced, the Microsoft.Build.Sql project SDK provides a cross-platform framework for your database-as-code such that the database objects are ready to be checked into source control and deployed via pipelines like any other modern application component. Today Microsoft.Build.Sql enters general availability as another step in the evolution of SQL database development. Standardized SQL database as code SQL projects are a .NET-based project type for SQL objects, compiling a folder of SQL scripts into a database artifact (.dacpac) for manual or continuous deployments. As a developer working with SQL projects, you're creating the T-SQL scripts that define the objects in the database. While the development framework around SQL projects presents a clear build and deploy process for development, there's no wrong way to incorporate SQL projects into your development cycle. The SQL objects in the project can be manually written or generated via automation, including through the graphical schema compare interfaces or the SqlPackage extract command. Whether you're developing with SQL Server, Azure SQL, or SQL in Fabric, database development standardizes on a shared project format and the ecosystem of tooling around SQL projects. The same SQL projects tools, like the SqlPackage CLI, can be used to either deploy objects to a database or update those object scripts from a database. Free development tools for SQL projects, like the SQL database projects extension for VS Code and SQL Server Data Tools in Visual Studio, bring the whole development team together. The database model validation of a SQL project build provides early verification of the SQL syntax used in the project, before code is checked in or deployed. Code analysis for antipatterns that impact database design and performance can be enabled as part of the project build and extended. This code analysis capability adds in-depth feedback to your team's continuous integration or pre-commit checks as part of SQL projects. Objects in a SQL project are database objects you can have confidence in before they're deployed across your environments. Evolving from original SQL projects SQL projects converted to the Microsoft.Build.Sql SDK benefit from support for .NET 8, enabling cross-platform development and automation environments. While the original SQL project file format explicitly lists each SQL file, SDK-style projects are significantly simplified by including any .sql file in the SQL project's folder structure. Database references enable SQL projects to be constructed for applications where a single project isn't an effective representation, whether the database includes cross-database references or multiple development cycles contribute to the same database. Incorporate additional objects into a SQL project with database references through project reference, .dacpac artifact reference, and, new to Microsoft.Build.Sql, package references. Package references for database objects improve the agility and manageability of the release cycle of your database through improved visibility to versioning and simplified management of the referenced artifacts. Converting existing projects The Microsoft.Build.Sql project SDK is a superset of the functionality of the original SQL projects, enabling you to convert your current projects on a timeline that works best for you.
The original SQL projects in SQL Server Data Tools (SSDT) continue to be supported through the Visual Studio lifecycle, providing years of support for your existing original projects. Converting an existing SQL project to a Microsoft.Build.Sql project is currently a manual process to add a single line to the project file and remove several groups of lines. The resulting Microsoft.Build.Sql project file is generally easier to understand and iteratively develop, with significantly fewer merge conflicts than the original SQL projects. A command line tool, DacpacVerify, is now available to validate that your project conversion has completed without degrading the output .dacpac file. By creating a .dacpac before and after you upgrade the project file, you can use DacpacVerify to confirm the database model, database options, pre/post-deployment scripts, and SQLCMD variables match. The road ahead With SQL Server 2025 on the horizon, support for the SQL Server 2025 target platform will be introduced in a future Microsoft.Build.Sql release along with additional improvements to the SDK references. Many Microsoft.Build.Sql releases will coincide with releases to the DacFx .NET library and the SqlPackage CLI, with preview releases ahead of general availability releases several times a year. Feature requests and bug reports for the DacFx ecosystem, including Microsoft.Build.Sql, are managed through the GitHub repository. With the v1 GA of Microsoft.Build.Sql, we're also looking ahead to continued iteration in the development tooling. In Visual Studio, the preview of SDK-style SSDT continues with new features introduced in each Visual Studio release. Plans for Visual Studio include project upgrade assistance in addition to the overall replacement of the existing SQL Server Data Tools. In the SQL projects extension for VS Code, we're ensuring that the SQL projects capabilities from Azure Data Studio are brought over while also increasing the robustness of the VS Code project build experience. The Microsoft.Build.Sql project SDK empowers database development to integrate with the development cycle, whether you're focused on reporting, web development, AI, or anything else. Use Microsoft.Build.Sql projects to branch, build, commit, and ship your database – get started today from an existing database or with a new project. Get to know SQL projects from the documentation and DevOps samples. View the full article
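     To make the conversion and build steps concrete, here is a minimal sketch assuming the published Microsoft.Build.Sql tooling: the "single line" mentioned above is the Sdk attribute on the project element, e.g. <Project Sdk="Microsoft.Build.Sql/1.0.0">, and a new SDK-style project can be created and built cross-platform with the .NET CLI (the project name is a placeholder):

     ```bash
     # Install the SQL projects templates, then create and build an SDK-style project
     dotnet new install Microsoft.Build.Sql.Templates
     dotnet new sqlproj -n AdventureWorks
     dotnet build AdventureWorks/AdventureWorks.sqlproj   # produces the .dacpac under bin/
     ```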
  15. In today’s fast-paced digital landscape, High-Performance Computing (HPC) is a critical engine powering innovation across industries—from automotive and aerospace to energy and manufacturing. To keep pace with escalating performance demands and the need for agile, risk-free testing environments, AMD has partnered with Microsoft and leading Independent Software Vendors (ISVs) to introduce the AMD HPC Innovation Lab. This pioneering sandbox environment on Azure is a “try before you buy” solution designed to empower customers to run their HPC workloads, assess performance, and experience AMD’s newest hardware innovations that deliver enhanced performance, scalability, and consistency—all without any financial commitments. Introducing the AMD Innovation Lab: A New Paradigm in Customer Engagement The AMD HPC Innovation Lab represents a paradigm shift in customer engagement for HPC solutions. Traditionally, organizations had to invest significant time and resources to build and manage on-premises testing environments, dealing with challenges such as hardware maintenance, scalability issues, and high operational costs. Without the opportunity to fully explore the benefits of cloud solutions through a trial offer, they often missed out on the advantages of cloud computing. With this innovative lab, customers now have the opportunity to experiment with optimized HPC environments in a simple, user-friendly interface. The process is straightforward: upload your input file or choose from the pre-configured options, run your workload, and then download your output file for analysis. This streamlined approach allows businesses to compare performance results on an apples-to-apples basis against other providers or existing on-premises setups. Empowering Decision Makers For Business Decision Makers (BDMs) and Technical Decision Makers (TDMs), the lab offers a compelling value proposition. It eliminates the complexities and uncertainties often associated with traditional testing environments by providing a risk-free opportunity to: Thoroughly Evaluate Performance: With access to AMD’s cutting-edge chipsets and Azure’s robust cloud infrastructure, organizations can conduct detailed proof-of-concept evaluations without incurring long-term costs. Accelerate Decision-Making: The streamlined testing process not only speeds up the evaluation phase but also accelerates the overall time to value, enabling organizations to make informed decisions quickly. Optimize Infrastructure: Created in partnership with ISVs and optimized by both AMD and Microsoft, the lab ensures that the infrastructure is fine-tuned for HPC workloads. This guarantees that performance assessments are both accurate and reflective of real-world scenarios. Seamless Integration with Leading ISVs A notable strength of the AMD HPC Innovation Lab is its collaborative design with top ISVs like Ansys, Altair, Siemens, and others. These partnerships ensure that the lab’s environment is equipped with industry-leading applications and solvers, such as Ansys Fluent for fluid dynamics and Ansys Mechanical for structural analysis. Each solver is pre-configured to provide a balanced and consistent performance evaluation, ensuring that users can benchmark their HPC workloads against industry standards with ease. Sustainability and Scalability Beyond performance and ease-of-use, the AMD HPC Innovation Lab is built with sustainability in mind. 
By leveraging Azure’s scalable cloud infrastructure, businesses can conduct HPC tests without the overhead and environmental impact of maintaining additional on-premises resources. This not only helps reduce operational costs but also supports corporate sustainability goals by minimizing the carbon footprint associated with traditional HPC setups. An Exciting Future for HPC Testing The innovation behind the AMD HPC Innovation Lab is just the beginning. With plans to continuously expand the lab catalog and include more ISVs, the platform is set to evolve as a comprehensive testing ecosystem. This ongoing expansion will provide customers with an increasingly diverse set of tools and environments tailored to meet a wide array of HPC needs. Whether you’re evaluating performance for fluid dynamics, structural simulations, or electromagnetic fields, the lab’s growing catalog promises to deliver precise and actionable insights. Ready to Experience the Future of HPC? The AMD HPC Innovation Lab on Azure offers a unique and exciting opportunity for organizations looking to harness the power of advanced computing without upfront financial risk. With its intuitive interface, optimized infrastructure, and robust ecosystem of ISVs, this sandbox environment is a game-changer in HPC testing and validation. Take advantage of this no-cost, high-impact solution to explore, experiment, and experience firsthand the benefits of AMD-powered HPC on Azure. To learn more and sign up for the program, visit https://aka.ms/AMDInnovationLab/LearnMore View the full article
  16. **Event Date & Time: May 21, 2025, 12:00 - 1:00 PM GMT+8 Singapore ** Learn how to effectively plan, build, and deploy generative AI solutions. Join us at Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar. The course is designed to empower ISVs with the tools and frameworks needed to create a development plan for your preidentified AI use cases. What's in store: Framework Deep Dive: Understand the Capability Envisioning framework—a strategic approach to accelerate AI solution development. Considerations for development plan: Explore the technical and business considerations for creating AI solutions and discover tools designed to help you build transformative generative AI applications. Expert Insights: Learn from leading AI professionals at Microsoft who will share their experiences and best practices. Interactive Q&A: Engage with our experts and get your AI queries answered in real-time. Exclusive Resources: Access a wealth of materials to help you design and develop top-notch AI solutions. After attending, you’ll have practical knowledge on how to apply the framework to create your development plan and build your AI solution. How to Register Register now for the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  17. **Event Date & Time: June 17, 2025, 1:00 - 2:00 PM GMT+1 UK ** Learn how to effectively plan, build, and deploy generative AI solutions. Join us at Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar. The course is designed to empower ISVs with the tools and frameworks needed to create a development plan for your preidentified AI use cases. What's in store: Framework Deep Dive: Understand the Capability Envisioning framework—a strategic approach to accelerate AI solution development. Considerations for development plan: Explore the technical and business considerations for creating AI solutions and discover tools designed to help you build transformative generative AI applications. Expert Insights: Learn from leading AI professionals at Microsoft who will share their experiences and best practices. Interactive Q&A: Engage with our experts and get your AI queries answered in real-time. Exclusive Resources: Access a wealth of materials to help you design and develop top-notch AI solutions. After attending, you'll have practical knowledge on how to apply the framework to create your development plan and build your AI solution. How to Register Register now for the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  18. **Event Date & Time: June 17, 2025, 10:00 - 11:00 AM PST ** Learn how to effectively plan, build, and deploy generative AI solutions. Join us at Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar. The course is designed to empower ISVs with the tools and frameworks needed to create a development plan for your preidentified AI use cases. What's in store: Framework Deep Dive: Understand the Capability Envisioning framework—a strategic approach to accelerate AI solution development. Considerations for development plan: Explore the technical and business considerations for creating AI solutions and discover tools designed to help you build transformative generative AI applications. Expert Insights: Learn from leading AI professionals at Microsoft who will share their experiences and best practices. Interactive Q&A: Engage with our experts and get your AI queries answered in real-time. Exclusive Resources: Access a wealth of materials to help you design and develop top-notch AI solutions. After attending, you’ll have practical knowledge on how to apply the framework to create your development plan and build your AI solution. How to Register Register now for the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  19. **Event Date & Time: May 29, 2025, 10:00 - 11:00 AM PST ** Learn how to effectively plan, build, and deploy generative AI solutions. Join us at Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar. The course is designed to empower ISVs with the tools and frameworks needed to create a development plan for your preidentified AI use cases. What's in store: Framework Deep Dive: Understand the Capability Envisioning framework—a strategic approach to accelerate AI solution development. Considerations for development plan: Explore the technical and business considerations for creating AI solutions and discover tools designed to help you build transformative generative AI applications. Expert Insights: Learn from leading AI professionals at Microsoft who will share their experiences and best practices. Interactive Q&A: Engage with our experts and get your AI queries answered in real-time. Exclusive Resources: Access a wealth of materials to help you design and develop top-notch AI solutions. After attending, you’ll have practical knowledge on how to apply the framework to create your development plan and build your AI solution. How to Register Register now for the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  20. **Event Date & Time: May 21, 2025, 1:00 - 2:00 PM GMT+1 UK ** Learn how to effectively plan, build, and deploy generative AI solutions. Join us at Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar. The course is designed to empower ISVs with the tools and frameworks needed to create a development plan for your preidentified AI use cases. What's in store: Framework Deep Dive: Understand the Capability Envisioning framework—a strategic approach to accelerate AI solution development. Considerations for development plan: Explore the technical and business considerations for creating AI solutions and discover tools designed to help you build transformative generative AI applications. Expert Insights: Learn from leading AI professionals at Microsoft who will share their experiences and best practices. Interactive Q&A: Engage with our experts and get your AI queries answered in real-time. Exclusive Resources: Access a wealth of materials to help you design and develop top-notch AI solutions. After attending, you'll have practical knowledge on how to apply the framework to create your development plan and build your AI solution. How to Register Register now for the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  21. **Event Date & Time: May 21, 2025, 12:00 - 1:00 PM UTC+8 Singapore ** Learn how to effectively plan, build, and deploy generative AI solutions. Join us at Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar. The course is designed to empower ISVs with the tools and frameworks needed to create a development plan for your preidentified AI use cases. What's in store: Framework Deep Dive: Understand the Capability Envisioning framework—a strategic approach to accelerate AI solution development. Considerations for development plan: Explore the technical and business considerations for creating AI solutions and discover tools designed to help you build transformative generative AI applications. Expert Insights: Learn from leading AI professionals at Microsoft who will share their experiences and best practices. Interactive Q&A: Engage with our experts and get your AI queries answered in real-time. Exclusive Resources: Access a wealth of materials to help you design and develop top-notch AI solutions. After attending, you’ll have practical knowledge on how to apply the framework to create your development plan and build your AI solution. How to Register Register now for the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Get the Framework to Develop AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  22. **Event Date & Time: June 3, 2025, 1:00 - 2:00 PM GMT+1 London** Join us at Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions to explore exciting AI business opportunities and learn how to craft a comprehensive development plan for your AI solutions. This session empowers businesses to drive growth by introducing an envisioning framework to help you identify and prioritize the most impactful AI use cases for solution development. What to Expect During the Business Envisioning webinar, we will walk you through a comprehensive framework that will enable you to create a development plan for building AI applications. The session will cover: Business Envisioning: Understand how to identify and prioritize business use cases for AI solutions. Learn the importance of aligning AI initiatives with your business goals to maximize value. The Business, User Experience, and Technology Prioritization Framework: The business, experience, technology (BXT) framework enables ISVs to evaluate the potential of their use cases. See how this exercise helps ISVs structure the details of each use case to better evaluate their viability. Use Case Prioritization: Employ the BXT rankings as an agile method to evaluate and differentiate the value and learning opportunities of each use case being considered and then implement prioritization accordingly. Why Attend? By attending the Business Envisioning webinar, you will: Gain Valuable Insights: Learn from Microsoft experts about the latest trends and best practices in AI application development. Network with Peers: Connect with other ISVs and share experiences, challenges, and solutions. Accelerate Your AI Journey: Get practical advice and actionable steps to kickstart your AI projects and bring them to market faster. How to Register Register now for the Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  23. **Event Date & Time: June 3, 2025, 12:00 - 1:00 PM UTC+8 Singapore** Join us at Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions to explore exciting AI business opportunities and learn how to craft a comprehensive development plan for your AI solutions. This session empowers businesses to drive growth by introducing an envisioning framework to help you identify and prioritize the most impactful AI use cases for solution development. What to Expect During the Business Envisioning webinar, we will walk you through a comprehensive framework that will enable you to create a development plan for building AI applications. The session will cover: Business Envisioning: Understand how to identify and prioritize business use cases for AI solutions. Learn the importance of aligning AI initiatives with your business goals to maximize value. The Business, User Experience, and Technology Prioritization Framework: The business, experience, technology (BXT) framework enables ISVs to evaluate the potential of their use cases. See how this exercise helps ISVs structure the details of each use case to better evaluate their viability. Use Case Prioritization: Employ the BXT rankings as an agile method to evaluate and differentiate the value and learning opportunities of each use case being considered and then implement prioritization accordingly. Why Attend? By attending the Business Envisioning webinar, you will: Gain Valuable Insights: Learn from Microsoft experts about the latest trends and best practices in AI application development. Network with Peers: Connect with other ISVs and share experiences, challenges, and solutions. Accelerate Your AI Journey: Get practical advice and actionable steps to kickstart your AI projects and bring them to market faster. How to Register Register now for the Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article
  24. The Significance of OAuth 2.0 and OIDC in Contemporary Society. In today's digital landscape, securing user authentication and authorization is paramount. Modern authentication protocols like OAuth 2.0 and OpenID Connect (OIDC) have become the backbone of secure and seamless user experiences. This blog delves into the roles of OAuth 2.0 and OIDC, their request flows, troubleshooting scenarios, and their significance in the modern world.
     Why OAuth 2.0? What problem does it solve? Let's compare OAuth to traditional forms-based authentication:
     Password Sharing: OAuth eliminates the need for password sharing, reducing credential theft risk. Forms authentication requires users to share passwords, increasing the risk of credential theft.
     Access Control: OAuth provides granular access control, allowing users to grant specific access to applications. Forms authentication offers limited access control, often granting full access once authenticated.
     Security Measures: OAuth adds enhanced security measures, creating a safer environment for authentication. Forms authentication is susceptible to phishing attacks and credential theft.
     User Experience: OAuth simplifies login processes, enhancing user experience. Forms authentication can lead to user password fatigue and weak password practices.
     Credential Storage: OAuth does not require storing user credentials, reducing the risk of breaches. Forms authentication requires secure storage of user credentials, which can be challenging.
     Session Hijacking: OAuth provides mechanisms to prevent session hijacking. Forms authentication is vulnerable to session hijacking, where attackers steal session cookies.
     OAuth 2.0 Overview: OAuth 2.0 is an authorization framework that allows third-party applications to obtain limited access to user resources without exposing user credentials. It provides a secure way for users to grant access to their resources hosted on one site to another site without sharing their credentials.
     OAuth 2.0 Request Flow. Here's a simplified workflow:
     Authorization Request: The client application redirects the user to the authorization server, requesting authorization.
     User Authentication: The user authenticates with the authorization server.
     Authorization Grant: The authorization server redirects the user back to the client application with an authorization code.
     Token Request: The client application exchanges the authorization code for an access token by making a request to the token endpoint.
     Token Response: The authorization server returns the access token to the client application, which can then use it to access protected resources.
     Let's take an example to depict the above authorization code flow. Consider a front-end .NET Core application built to request an auth token from the auth server; the token is then redeemed for an access token and passed on to an API to get simple weather details.
     1. In Program.cs we will have the following code:

     builder.Services.AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme)
         .AddMicrosoftIdentityWebApp(builder.Configuration.GetSection("AzureAd"))
         .EnableTokenAcquisitionToCallDownstreamApi(new string[] { "user.read" })
         .AddDownstreamApi("Weather", builder.Configuration.GetSection("Weather"))
         .AddInMemoryTokenCaches();

     The above code configures the application to use Microsoft Identity for authentication, acquire tokens to call downstream APIs, and cache tokens in memory.
     AddMicrosoftIdentityWebApp: This line registers the OIDC authentication scheme. It reads the Azure AD settings from the AzureAd section of the configuration file (e.g., appsettings.json).
This setup allows the application to authenticate users using Azure Active Directory.
     EnableTokenAcquisitionToCallDownstreamApi: This line enables the application to acquire tokens to call downstream APIs. The user.read scope is specified, which allows the application to read the user's profile information. This is essential for accessing protected resources on behalf of the user.
     AddDownstreamApi: This line configures a downstream API named "Weather". It reads the configuration settings for the Weather API from the Weather section of the configuration file. This setup allows the application to call the Weather API using the acquired tokens.
     AddInMemoryTokenCaches: This line adds an in-memory token cache to the application. Token caching is crucial for improving performance and reducing the number of token requests. By storing tokens in memory, the application can reuse them for subsequent API calls without needing to re-authenticate the user.
     2. In appsettings.json we will have the following:

     "AzureAd": {
         "Instance": "https://login.microsoftonline.com/",
         "Domain": "Domain name",
         "TenantId": "Add tenant ID",
         "ClientId": "Add client ID",
         "CallbackPath": "/signin-oidc",
         "Scopes": "user.read",
         "ClientSecret": "",
         "ClientCertificates": []
     },

     In the HomeController we can inject IDownstreamApi through the default constructor:

     private IDownstreamApi _downstreamApi;
     private const string ServiceName = "Weather";

     public HomeController(ILogger<HomeController> logger, IDownstreamApi downstreamApi)
     {
         _logger = logger;
         _downstreamApi = downstreamApi;
     }

     3. The following section makes an API call:

     public async Task<IActionResult> Privacy()
     {
         try
         {
             var value = await _downstreamApi.CallApiForUserAsync(ServiceName, options => { });
             if (value == null)
             {
                 return NotFound(new { error = "API response is null." });
             }
             value.EnsureSuccessStatusCode(); // Throws if response is not successful
             string jsonContent = await value.Content.ReadAsStringAsync();
             return Content(jsonContent, "application/json"); // Sends raw JSON as is
         }
         catch (HttpRequestException ex)
         {
             return StatusCode(500, new { error = "Error calling API", details = ex.Message });
         }
     }

     The above code captures the token by making a call to the identity provider and forwards the redeemed access token (i.e., the bearer token) to the backend API.
     4. Now let's see the setup at the Web API. In Program.cs we will have the following code snippet:

     var builder = WebApplication.CreateBuilder(args);
     // Add services to the container.
     builder.Services.AddControllers();
     builder.Services.AddMicrosoftIdentityWebApiAuthentication(builder.Configuration);
     builder.Services.AddEndpointsApiExplorer();
     builder.Services.AddSwaggerGen();

     Followed by appsettings.json:

     "AzureAd": {
         "Instance": "https://login.microsoftonline.com/",
         "Domain": "Domain name",
         "TenantId": "Add tenant ID",
         "ClientId": "Add client ID",
         "CallbackPath": "/signin-oidc",
         "Scopes": "user.read",
         "ClientSecret": "",
         "ClientCertificates": []
     },

     In the controller we can have the following:

     namespace APIOauth.Controllers
     {
         [Authorize(AuthenticationSchemes = "Bearer")]
         [ApiController]
         [Route("[controller]")]
         public class WeatherForecastController : ControllerBase
         {
             private static readonly string[] Summaries = new[]
             {
                 "Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
             };

     To drill into the request flow, let's capture a Fiddler trace. Step 1: The first two calls are made by the application to the openid-configuration and keys endpoints.
The first step is crucial, as the application requires the OpenID configuration to know what configuration it has and which options are supported. Example: claims_supported, scopes_supported, token_endpoint_auth_methods_supported, response_modes_supported, etc. Secondly, the keys endpoint provides all the public keys, which can later be used to validate the signature of the tokens received. Step 2: Once we have the above config and keys, the application redirects the user to the identity provider with the following parameters. Points to note in the above screen are the response_type, which is code (authorization code), and the response_mode, which is form_post. Step 3: The subsequent request is the POST back to the application, which carries the authorization code. Step 4: In this step we redeem the auth code for an access token. The request is made by attaching the auth code along with the following parameters, and the response is received with an access token (a concrete curl sketch of this exchange appears at the end of this post). Step 5: Now the final call is made to the API along with the access token to get the weather details. This completes the OAuth authorization code flow. Let us now take a moment to gain a brief understanding of JWT tokens. JWTs are widely used for authentication and authorization in modern web applications due to their compact size and security features. They allow secure transmission of information between parties and can be easily verified and trusted. Structure: A JWT consists of three parts separated by dots (.): Header: contains metadata about the type of token and the cryptographic algorithms used. Payload: contains the claims. Claims are statements about an entity (typically, the user) and additional data. Signature: ensures that the token wasn't altered. It is created by taking the encoded header, the encoded payload, a secret, and the algorithm specified in the header, and signing that. OpenID Connect (OIDC). OIDC Overview: OpenID Connect is an authentication layer built on top of OAuth 2.0. While OAuth 2.0 handles authorization, OIDC adds authentication, allowing applications to verify the identity of users and obtain basic profile information. This combination ensures both secure access and user identity verification. OIDC Request Flow: OIDC extends the OAuth 2.0 authorization code flow by adding an ID token, which contains user identity information. Here's a simplified workflow: Authorization Request: The client application redirects the user to the authorization server, requesting authorization and an ID token. User Authentication: The user authenticates with the authorization server. Authorization Grant: The authorization server redirects the user back to the client application with an authorization code. Token Request: The client application exchanges the authorization code for an access token and an ID token by making a request to the token endpoint. Token Response: The authorization server returns the access token and ID token to the client application. The ID token contains user identity information, which the client application can use to authenticate the user. Example: Consider a .NET application which is set up for user authentication. Let's capture a Fiddler trace once again to see the authentication flow. Steps 1 and 2 remain the same as in the authorization code flow: a call to the OpenID configuration endpoint and a call to the keys endpoint. Step 3: The response type here is "id_token" and not an auth code as we saw in the authorization code flow.
This is an implicit flow, since we are not redeeming or exchanging an auth code. Also, an implicit flow doesn't need a client secret. Step 4: In a POST request, the browser receives an ID token. This completes the authentication flow, which results in an ID token that admits the user to the application. Common Troubleshooting Scenarios: Implementing OAuth in ASP.NET Core can sometimes present challenges. Here are some common issues and how to address them: 1. Misconfigurations: Misconfigurations can lead to authentication failures and security vulnerabilities. For example, loss of internet connection or incorrect settings in the OAuth configuration can disrupt the authentication process. One example we have faced is servers placed in a DMZ with no internet access; the server needs to make an outbound call to login.microsoftonline.com, or whichever identity provider is in use, to get the metadata for OpenID/OAuth. 2. Failures due to server farm setup: loss of data protection keys across different workers. Data protection is used to protect cookies; in a server farm, the data protection keys should be persisted and shared. One common issue with data protection keys in the OAuth flow is the synchronization of keys across different servers or instances. If the keys are not synchronized correctly, it can result in authentication failures and disrupt the OAuth flow. 3. Token Expiration: Token expiration can disrupt user sessions and require re-authentication, which can frustrate users. It's essential to implement token refresh functionality to enhance user experience and security. 4. Redirect URI Mismatches: Redirect URI mismatches can prevent applications from receiving authorization codes, causing login failures. Ensure that the redirect URI specified in the identity provider's settings matches the one in your application. 5. Scope Misconfigurations: Improperly configured scopes can result in inadequate permissions and restrict access to necessary resources. It's crucial to define the correct scopes to ensure that applications have the necessary permissions to access resources. By understanding these common pitfalls and implementing best practices, developers can successfully integrate OAuth into their ASP.NET Core applications, ensuring a secure and seamless user experience. I hope it helps! View the full article
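     To make step 4 of the authorization code flow concrete: the token redemption is just an HTTP POST to the Microsoft identity platform token endpoint. A hedged curl sketch with placeholder values (tenant, client, code, secret, and redirect URI must come from your own app registration):

     ```bash
     curl -X POST "https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token" \
       -H "Content-Type: application/x-www-form-urlencoded" \
       -d "client_id={client-id}" \
       -d "scope=user.read" \
       -d "code={authorization-code-from-the-form_post}" \
       -d "redirect_uri=https://localhost:5001/signin-oidc" \
       -d "grant_type=authorization_code" \
       -d "client_secret={client-secret}"
     ```

     In the sample application above, Microsoft.Identity.Web performs this exchange for you; the sketch only shows what happens on the wire.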
  25. **Event Date & Time: June 3, 2025, 10:00 - 11:00 AM PST ** Join us at Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions to explore exciting AI business opportunities and learn how to craft a comprehensive development plan for your AI solutions. This session empowers businesses to drive growth by introducing an envisioning framework to help you identify and prioritize the most impactful AI use cases for solution development. What to Expect During the Business Envisioning webinar, we will walk you through a comprehensive framework that will enable you to create a development plan for building AI applications. The session will cover: Business Envisioning: Understand how to identify and prioritize business use cases for AI solutions. Learn the importance of aligning AI initiatives with your business goals to maximize value. The Business, User Experience, and Technology Prioritization Framework: The business, experience, technology (BXT) framework enables ISVs to evaluate the potential of their use cases. See how this exercise helps ISVs structure the details of each use case to better evaluate their viability. Use Case Prioritization: Employ the BXT rankings as an agile method to evaluate and differentiate the value and learning opportunities of each use case being considered, and then implement prioritization accordingly. Why Attend? By attending the Business Envisioning webinar, you will: Gain Valuable Insights: Learn from Microsoft experts about the latest trends and best practices in AI application development. Network with Peers: Connect with other ISVs and share experiences, challenges, and solutions. Accelerate Your AI Journey: Get practical advice and actionable steps to kickstart your AI projects and bring them to market faster. How to Register Register now for the Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions session. The session will be delivered in English. We look forward to seeing you at the Microsoft ISV AI Envisioning Day: Identify and Prioritize Use Cases for AI Solutions webinar and helping you unlock the full potential of AI for your business! View the full article