Everything posted by Windows Server
-
Greetings pilots, and welcome to another pioneering year of AI innovation with Security Copilot. Find out how your organization can reach new heights with Security Copilot through the many exciting announcements on the way at both Microsoft Secure and RSA 2025. This is why now is the time to familiarize yourself and get airborne with Security Copilot.

Go to School
Microsoft Security Copilot Flight School is a comprehensive series charted to take students through fundamental concepts of AI definitions and architectures, take flight with prompting and automation, and hit supersonic speeds with Logic Apps and custom plugins. By the end of the course, students should be equipped with the requisite knowledge for how to successfully operate Security Copilot to best meet their organizational needs. The series contains 11 episodes, each with a flight time of around 10 minutes.

Security Copilot is something I really, really enjoy, whether I'm actively contributing to its improvement or advocating for the platform's use across security and IT workflows. Ever since I was granted access two years ago – which feels like a millennium in the age of AI – it's been a passion of mine, and it's why just recently I officially joined the Security Copilot product team. This series in many ways reflects not only my passion but the similar passion found in my marketing colleagues Kathleen Lavallee (Senior Product Marketing Manager, Security Copilot), Shirleyse Haley (Senior Security Skilling Manager), and Shateva Long (Product Manager, Security Copilot). I hope that you enjoy it just as much as we did making it. Go ahead and put on your favorite noise-cancelling headphones; it's time, pilots, to take flight.

Log Flight Hours
There are two options for watching Security Copilot Flight School: either on Microsoft Learn or via the YouTube playlist found on the Microsoft Security YouTube channel. The first two episodes focus on establishing core fundamentals of Security Copilot platform design and architecture – or perhaps attaining your instrument rating. The episodes thereafter are plotted differently, around a standard operating procedure. To follow the ideal flight path, Security Copilot should be configured and ready to go – head over to MS Learn and the Adoption Hub to get airborne. It's also recommended that pilots watch the series sequentially, and be prepared to follow along with resources found on GitHub, to maximize learning and best align with the material. This will mean that you'll need to coordinate with a pilot with owner permissions for your instance to create and manipulate the necessary resources.

Episode 1 - What is Microsoft Security Copilot?
Security is complex and requires highly specialized skills to face the challenges of today. Because of this, many of the people working to protect an organization work in silos that can be isolated from other business functions. Further, enterprises are highly fragmented environments with esoteric systems, data, and processes. All of which takes a tremendous amount of time, energy, and effort just to do the day-to-day. Security Copilot is a cloud-based, AI-powered security platform that is designed to address the challenges presented by complex and fragmented enterprise environments by redefining what security is and how security gets done. What is AI, and why exactly should it be used in a cybersecurity context?
Episode 2 - AI Orchestration with Microsoft Security Copilot
Why is The Paper Clip Pantry a 5-star restaurant renowned the world over for its Wisconsin Butter Burgers? Perhaps it's how a chef uses a staff with unique skills and orchestrates the sourcing of resources in real time, against specific contexts, to complete an order. After watching this episode you'll understand how AI orchestration works, why nobody eats a burger with only ketchup, and how the Paper Clip Pantry operates just like the Security Copilot orchestrator.

Episode 3 – Standalone and Embedded Experiences
Do you have a friend who eats pizza in an inconceivable way? Maybe they eat a slice crust-first, or dip it into a sauce you never thought compatible with pizza? They work with pizza differently, just like any one security workflow could be different from one task, team, or individual to the next. This philosophy is why Security Copilot has two experiences – solutions embedded within products, and a standalone portal – to augment workflows no matter their current state. This episode will begin covering those experiences.

Episode 4 – Other Embedded Experiences
Turns out you can also insist upon putting cheese inside of pizza crust, or bake it thick enough to require a fork and knife. I imagine it's probably something Windows 95 Man would do. In this episode, the Microsoft Entra, Purview, Intune, and Microsoft Threat Intelligence products showcase how Security Copilot advances their workflows within their portals. Beyond reinforcing the concept of many workflows and many operators, the takeaway from this episode is that Security Copilot also works with security-adjacent workflows – IT, identity, and DLP.

Episode 5 – Manage Your Plugins
Like our chef in The Paper Clip Pantry, we should probably define what we want to cook, which chefs to use, and set permissions for those that can interact with any input or output from the kitchen. Find out what plugins add to Security Copilot and how you can set plugin controls for your team and organization.

Episode 6 – Prompting
Is this an improv lesson, or a baking show? Or maybe, if you watch this episode, you'll learn how Security Copilot handles natural language inputs to provide you meaningful answers known as responses.

Episode 7 – Prompt Engineering
With the fundamentals of prompting in your flight log, it's time to soar a bit higher with prompt engineering. In this episode you will learn how to structure prompts in a way that maximizes the benefits of Security Copilot and begin building workflows. Congrats, pilot, your burgers will no longer come with just ketchup.

Episode 8 – Using Promptbooks
What would it look like to find a series of prompts and run them, in the same sequence with the same output, every time? You guessed it: a promptbook, a repeatable workflow in the age of AI. See where to access promptbooks within the platform, and claw back some of your day to perfect your next butter burger.

Episode 9 – Custom Promptbooks
You've been tweaking your butter burger recipe for months now. You've finally landed at the perfect version by incorporating a secret nacho cheese recipe. The steps are defined, the recipe perfect. How do you repeat it? Just like your butter burger creation, you might discover or design workflows with Security Copilot. With custom promptbooks you can repeat and share them across your organization. In this episode you'll learn about the different ways Security Copilot helps you develop your own custom AI workflows.
Episode 10 – Logic Apps
System automation, robot chefs? Actions? What if customers could order butter burgers with the click of a button, and the kitchen staff would automatically make one? Or perhaps every Friday at 2pm a butter burger was just delivered to you? Chances are there are different conditions across your organization that, when present, require a workflow to begin. With Logic Apps, Security Copilot can be used to automatically aid workflows across any system a Logic App can connect to. More automation, less mouse clicking, that's a flight plan everyone can agree on.

Episode 11 – Extending to Your Ecosystem
A famed restaurant critic stopped into The Paper Clip Pantry for a butter burger, and it's now the burger everyone is talking about. Business is booming and it's time to expand the menu – maybe a butter burger pizza, perhaps a doughnut butter burger? But you'll need some new recipes and sources of knowledge to achieve this. Like a food menu, the possibilities for expanding Security Copilot's capabilities are endless. In this episode, learn how this can be achieved with custom plugins and knowledge bases. Once you have that in your log, you will be a certified Ace, ready to take flight with Security Copilot.

Take Flight
I really hope that you not only learn something new but have fun taking flight with the Security Copilot Flight School. As with any new and innovative technology, the learning never stops, and there will be opportunities to log more flight hours from our expert flight crews. Stay tuned at the Microsoft Security Copilot video hub, Microsoft Secure, and RSA 2025 for more content in the next few months. If you think it's time to get the rest of your team and/or organization airborne, check out the Security Copilot adoption hub to get started: aka.ms/SecurityCopilotAdoptionHub

Other Resources
Our teams have been hard at work building solutions to extend Security Copilot; you can find them on our community GitHub page: aka.ms/SecurityCopilotGitHubRepo
To stay close to the latest in product news and development, and to interact with our engineering teams, please join the Security Copilot CCP to get the latest information: aka.ms/JoinCCP
View the full article
-
The Future of AI blog series is an evolving collection of posts from the AI Futures team in collaboration with subject matter experts across Microsoft. In this series, we explore tools and technologies that will drive the next generation of AI. Explore more at: https://aka.ms/the-future-of-ai

Customizing AI agents with the Semantic Kernel agent framework
AI agents are autonomous entities designed to solve complex tasks for humans. Compared to traditional software agents, AI-powered agents allow for more robust solutions with less coding. Individual AI agents have shown significant capabilities, achieving results previously not possible. The potential of these agents is enhanced when multiple specialized agents collaborate within a multi-agent system. Research has shown that such systems, comprising single-purpose agents, are more effective than single multi-purpose agents in many tasks [1]. This enables automation of more complex workflows with improved results and higher efficiency in the future. In this post, we are going to explore how you can build single agents and multi-agent systems with Semantic Kernel.

Semantic Kernel is a lightweight and open-source SDK developed by Microsoft, designed to facilitate the creation of production-ready AI solutions. Despite its capabilities, Semantic Kernel remains accessible, allowing developers to start with minimal code. For scalable deployment, it offers advanced features such as telemetry, hooks, and filters to ensure the delivery of secure and responsible AI solutions. The Semantic Kernel Agent Framework offers pro-code orchestration within the Semantic Kernel ecosystem, facilitating the development of AI agents and agentic patterns capable of addressing more complex tasks autonomously.

Starting with individual agents is recommended. Semantic Kernel provides a variety of AI service connectors, allowing developers and companies to select models from different providers or even local models. Additionally, Semantic Kernel gives developers the flexibility to integrate their agents created from managed services like Azure OpenAI Service Assistant API and Azure AI Agent Service into a unified system. Refer to the samples in the Semantic Kernel GitHub repository to get you started.
Python: semantic-kernel/python/samples/getting_started_with_agents at main · microsoft/semantic-kernel
.NET: semantic-kernel/dotnet/samples/GettingStartedWithAgents at main · microsoft/semantic-kernel

Previous posts have thoroughly examined the principles of designing single agents and the effectiveness of multi-agent systems. The objective of this post is not to determine when a single agent should be employed versus a multi-agent system; however, it is important to emphasize that agents should be designed with a single purpose to maximize their performance. Assigning multiple responsibilities or capabilities to a single agent is likely to result in suboptimal outcomes. If your tasks can be efficiently accomplished by a single agent, that's great! If you find that the performance of a single agent is unsatisfactory, you might consider employing multiple agents to collaboratively address your tasks. Our recent Microsoft Mechanics video outlines how a multi-agent system operates. Semantic Kernel offers a highly configurable chat-based agentic pattern, with additional patterns coming soon. It accommodates two or more agents and supports custom strategies to manage the flow of chat, enhancing the system's dynamism and overall intelligence.
Semantic Kernel is production-ready with built-in features that are off by default but available when needed. One such feature is observability. Often in an agentic application, agent interactions are not shown in the output, which is typical since users often focus on results. Nonetheless, being able to inspect the inner process is crucial for developers. Tracking interactions becomes challenging as the number of agents increases and tasks grow complex. Semantic Kernel can optionally emit telemetry data to ease debugging. For a demonstration of three agents collaborating in real time, and a review of the agent interactions with the tracing UI in the Azure AI Foundry portal, please watch the following video demo:

The code for the demo can be found in a single demo app in the Semantic Kernel repository: semantic-kernel/python/samples/demos/document_generator at main · microsoft/semantic-kernel

In summary, Semantic Kernel offers an efficient framework for both single and multi-agent systems. As the platform evolves, it promises even more innovative patterns and capabilities, solidifying its role in agent-based AI. Whether for simple tasks or complex projects, Semantic Kernel provides the necessary tools to achieve your goals effectively. Happy coding!

To get started:
Explore Azure AI Foundry models, agentic frameworks, and toolchain features
Begin coding using the Semantic Kernel Python repository on GitHub
Download the Azure AI Foundry SDK
Review our Learn documentation
View the full article
-
Demo: Mpesa for Business Setup QA RAG Application
In this tutorial we are going to build a Question-Answering RAG Chat Web App. We utilize Node.js and HTML, CSS, and JS. We also incorporate Langchain.js + Azure OpenAI + MongoDB Vector Store (MongoDB Search Index). Get a quick look below.
Note: Documents and illustrations shared here are for demo purposes only and Microsoft or its products are not part of Mpesa. The content demonstrated here should be used for educational purposes only. Additionally, all views shared here are solely mine.

What you will need:
An active Azure subscription – get Azure for Students for free or get started with Azure for 12 months free.
VS Code
Basic knowledge of JavaScript (not a must)
Access to Azure OpenAI – click here if you don't have access.
A MongoDB account (you can also use the Azure Cosmos DB vector store)

Setting Up the Project
In order to build this project, you will have to fork this repository and clone it. GitHub repository link: https://github.com/tiprock-network/azure-qa-rag-mpesa . Follow the steps highlighted in the README.md to set up the project under Setting Up the Node.js Application.

Create Resources that you Need
In order to do this, you will need to have Azure CLI or Azure Developer CLI installed on your computer. Go ahead and follow the steps indicated in the README.md to create Azure resources under Azure Resources Set Up with Azure CLI. You might want to log in to Azure CLI differently, using a device code. Instead of using az login, you can run:
az login --use-device-code
Or, if you prefer the Azure Developer CLI, execute this command instead:
azd auth login --use-device-code
Remember to update the .env file with the values you used to name the Azure OpenAI instance and Azure models, as well as the API keys you obtained while creating your resources.

Setting Up MongoDB
After accessing your MongoDB account, get the URI link to your database and add it to the .env file, along with your database name and the vector store collection name you specified while creating your indexes for a vector search.

Running the Project
In order to run this Node.js project, start it with the following command:
npm run dev

The Vector Store
The vector store used in this project is MongoDB, where the word embeddings are stored. From the embeddings model instance we created on Azure AI Foundry we are able to create embeddings that can be stored in a vector store. The following code shows our embeddings model instance.

//create new embedding model instance
const azOpenEmbedding = new AzureOpenAIEmbeddings({
    azureADTokenProvider,
    azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
    azureOpenAIApiEmbeddingsDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_EMBEDDING_NAME,
    azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
    azureOpenAIBasePath: "https://eastus2.api.cognitive.microsoft.com/openai/deployments"
});

The code in uploadDoc.js offers a simple way to create embeddings and store them in MongoDB. In this approach the text from the documents is loaded using the PDFLoader from the LangChain community package. The following code demonstrates how the embeddings are stored in the vector store.
// Call the function and handle the result with await
const storeToCosmosVectorStore = async () => {
    try {
        const documents = await returnSplittedContent()

        //create store instance
        const store = await MongoDBAtlasVectorSearch.fromDocuments(
            documents,
            azOpenEmbedding,
            {
                collection: vectorCollection,
                indexName: "myrag_index",
                textKey: "text",
                embeddingKey: "embedding",
            }
        )

        if(!store){
            console.log('Something wrong happened while creating store or getting store!')
            return false
        }

        console.log('Done creating/getting and uploading to store.')
        return true
    } catch (e) {
        console.log(`This error occurred: ${e}`)
        return false
    }
}

In this setup, Question Answering (QA) is achieved by integrating Azure OpenAI's GPT-4o with MongoDB Vector Search through LangChain.js. The system processes user queries via an LLM (Large Language Model), which retrieves relevant information from a vectorized database, ensuring contextual and accurate responses. Azure OpenAI Embeddings convert text into dense vector representations, enabling semantic search within MongoDB. The LangChain RunnableSequence structures the retrieval and response generation workflow, while the StringOutputParser ensures proper text formatting. The most relevant code snippets to include are: the AzureChatOpenAI instantiation, the MongoDB connection setup, and the API endpoint handling QA queries using vector search and embeddings. There are some code snippets below to explain the major parts of the code.

Azure AI Chat Completion Model
This is the model used in this implementation of RAG, where we use it as the model for chat completion. Below is a code snippet for it.

const llm = new AzureChatOpenAI({
    azTokenProvider,
    azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
    azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
    azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION
})

Using a Runnable Sequence to give out Chat Output
This shows how a runnable sequence can be used to return a response, given the particular output format/output parser added to the chain.

//Stream response
app.post(`${process.env.BASE_URL}/az-openai/runnable-sequence/stream/chat`, async (req, res) => {
    //check for human message
    const { chatMsg } = req.body
    if(!chatMsg) return res.status(201).json({
        message: 'Hey, you didn\'t send anything.'
    })

    //put the code in an error-handler
    try {
        //create a prompt template format template
        const prompt = ChatPromptTemplate.fromMessages(
            [
                ["system", `You are a French-to-English translator that detects if a message isn't in French. If it's not, you respond, "This is not French." Otherwise, you translate it to English.`],
                ["human", `${chatMsg}`]
            ]
        )

        //runnable chain
        const chain = RunnableSequence.from([prompt, llm, outPutParser])

        //chain result
        let result_stream = await chain.stream()

        //set response headers
        res.setHeader('Content-Type', 'application/json')
        res.setHeader('Transfer-Encoding', 'chunked')

        //create readable stream
        const readable = Readable.from(result_stream)

        res.status(201).write(`{"message": "Successful translation.", "response": "`);

        readable.on('data', (chunk) => {
            // Convert chunk to string and write it
            res.write(`${chunk}`);
        });

        readable.on('end', () => {
            // Close the JSON response properly
            res.write('" }');
            res.end();
        });

        readable.on('error', (err) => {
            console.error("Stream error:", err);
            res.status(500).json({ message: "Translation failed.", error: err.message });
        });
    } catch(e) {
        //deliver a 500 error response
        return res.status(500).json(
            {
                message: 'Failed to send request.',
                error: e
            }
        )
    }
})

To run the front end of the code, go to your BASE_URL with the port given. This lets you run the chatbot above and achieve similar results. The chatbot is basically HTML + CSS + JS, where JavaScript is mainly used with the fetch API to get a response. Thanks for reading. I hope you play around with the code and learn some new things.

Additional Reads
Introduction to LangChain.js
Create an FAQ Bot on Azure
Build a basic chat app in Python using Azure AI Foundry SDK
View the full article
-
Hi All
I hope you are well. Anyway, on Android Enterprise Fully Managed devices, I have an ask to enforce a "no PIN, no device access" policy. These devices have the usual setup, where the PIN requirements are set with a device config policy and then checked with a corresponding compliance policy. But nowhere can I see a "restrict use of the device until a PIN is set" setting. Perhaps it's really obvious, but is this possible? The only obvious option I can see is in the compliance policy settings, under Actions for noncompliance, as below:
Would this be the appropriate setting or are there others? And if the device is locked, is the user able to set a PIN? Info appreciated.
SK
View the full article
-
Video content has become an essential medium for communication, learning, and marketing. Microsoft 365 Copilot, combined with the Visual Creator Agent, is redefining the way professionals create videos. By leveraging AI-driven automation, users can generate high-quality videos with minimal effort. In this blog, we’ll explore how the Visual Creator Agent works within Microsoft 365 Copilot, its key features, and how you can use it to streamline video production. Full details in this blog https://dellenny.com/generating-videos-in-microsoft-365-copilot-using-visual-creator-agent/ View the full article
-
Kirk Koenigsbauer is the COO of the Experiences + Devices division at Microsoft.

AI adoption is already happening in the workplace, but employees aren't waiting for an official rollout. Our most recent Work Trend Index shows that 75% of employees are using AI at work, and 78% of them are bringing their own AI tools – largely consumer-grade tools. This surge in AI adoption reflects clear demand for productivity gains, but unmanaged and often unsecured tools create real security and compliance risks. No organization wants confidential information inadvertently exposed or used to train external AI models. Leaders recognize the need for a secure, enterprise-wide AI solution that meets employee demand while ensuring data protection. However, some customers we meet with want to study ROI benefits before committing to a full AI subscription for every employee. That's why we introduced Microsoft 365 Copilot Chat in January 2025. Copilot Chat provides free, secure AI chat powered by GPT-4o, giving organizations an immediate and compliant alternative to consumer AI tools. Employees get powerful AI access while IT retains control—without requiring additional subscription commitments. With enterprise-grade security, built-in compliance, and flexible pay-as-you-go AI agents, Copilot Chat allows organizations to experiment, scale, and validate AI's impact. By offering employees a secure, discoverable, and powerful AI experience, organizations can embrace AI on their own terms—ensuring productivity gains without sacrificing security or overcommitting budgets.

Copilot Chat is a free* AI service for your whole organization
Copilot Chat helps employees in every role work smarter and accomplish more. When they need to do Internet-based research to get their job done, they should use Copilot Chat to get up-to-date summarized insights with speed and accuracy, without leaking sensitive information outside the company data boundary. And that's not all – employees can easily summarize, rewrite or get insights from files or documents by simply uploading them in chat and prompting Copilot. Enterprise data protection applies to prompts, responses and uploaded files, and they are stored securely to protect your organization's data. Copilot Chat also offers Pages, a persistent, digital canvas within the chat experience that lets employees collaborate with Copilot to create durable business artifacts.

Copilot is the UI for AI
Copilot is your UI for AI—a single, user-friendly entry point where employees can readily access AI-powered agents without needing specialized technical knowledge. These agents help employees save time, boost productivity, and streamline daily tasks. Now, with Copilot Chat, these agents are available to all employees, even without a Microsoft 365 Copilot license—ensuring that AI-powered assistance is accessible across the organization. Employees can use agents to automate repetitive tasks, retrieve information from SharePoint and connected data sources, and support specialized workflows like customer service or field troubleshooting. They can also build their own agents using Agent Builder in Copilot Chat, while IT admins can create and manage organization-wide agents through Copilot Studio. With flexible pay-as-you-go access, organizations can integrate AI-powered automation at their own pace, deploying agents where they drive the most impact.
Agents are priced based on metered consumption and can be managed through the Power Platform Admin Center or, coming soon, the Microsoft 365 Admin Center—see our documentation for more information. As businesses refine their AI strategy, they can easily scale usage and expand to full Microsoft 365 Copilot capabilities to maximize value.

Enterprise-grade security, compliance, and privacy
Copilot Chat offers enterprise data protection. That means it protects your data with encryption at rest and in transit, offers rigorous security controls, and maintains data isolation between tenants. Copilot Chat prompts and responses are protected by the same contractual terms and commitments widely trusted by our customers for their emails in Exchange and files in SharePoint, including support for GDPR, EUDB support, and our Data Protection Addendum. Prompts and responses are logged, can be managed with retention policies, and can be included in eDiscovery and other Purview capabilities. We also help safeguard against AI-focused risks such as harmful content and prompt injections. For content copyright concerns, we provide protected material detection and our Customer Copyright Commitment. Additionally, Copilot Chat offers granular controls and visibility over web grounded search, which enhances responses from the latest data from the web. Furthermore, you can have confidence that Copilot Chat is fully supported - just as you would expect from Microsoft's enterprise services.

Bringing Copilot Chat to your Organization
Customers can start with either the free or paid experience in the Microsoft 365 Copilot app, available at M365Copilot.com or in the Windows, Android, or iPhone app stores. To help your organization get the most out of the new Microsoft 365 Copilot Chat, we've updated the Copilot Success Kit and added the new Copilot Chat and Agent Starter Kit. This includes:
Copilot Chat and agent overview guide to help your users learn how to use agents to make Copilot Chat even more personalized and intelligent for their daily work.
Copilot Chat and agent IT setup and controls guide to plan, deploy, manage, and measure Microsoft 365 Copilot Chat in your organization.
User engagement templates in a variety of formats—including email, Viva Engage, and Teams—that you can leverage to communicate updates and new features to your users.

*Available at no additional cost for all Entra account users with a Microsoft 365 subscription
View the full article
-
I have been looking into mapping best practices for configuring a hardening/tiering model from on-premises Active Directory to Microsoft Entra Domain Services (MEDS). I'm well aware that MEDS is NOT a replacement for AD DS and has many restrictions and missing features, but that does not stop me from wanting to make it as secure as possible for member servers to be joined to. Since MEDS is a PaaS offering in Azure, deployed from within Azure and managed differently than Active Directory, there are of course different ways of implementing a good tiering model. In my study I wanted to see if I could enable the Protected Users feature (add users to the Protected Users group). However, I find the group is present but it's not possible to add members to it (the option is greyed out). I have a member server in the MEDS instance and have installed the AD DS Tools. My user is a member of the AAD DC Administrators group. I would like to know if anyone has some knowledge on the subject to share?
View the full article
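Not part of the original post, but one way to test the same operation outside the GUI is with the ActiveDirectory RSAT module that ships with the AD DS Tools mentioned above. This is only a diagnostic sketch under my assumptions: "someUser" is a placeholder, and in a managed Entra Domain Services domain the built-in Protected Users group is generally not writable by delegated admins, so the add is likely to be denied, which would confirm the behavior described above rather than work around it.

# Diagnostic sketch using the RSAT ActiveDirectory module (installed with the AD DS Tools).
# "someUser" is a hypothetical placeholder; expect an access-denied error if the managed
# domain blocks changes to built-in protected groups, matching the greyed-out GUI option.
Import-Module ActiveDirectory

# Inspect the built-in Protected Users group in the managed domain
Get-ADGroup -Identity "Protected Users" -Properties Description, Members | Format-List

# Attempt the membership change from PowerShell instead of the GUI
Add-ADGroupMember -Identity "Protected Users" -Members "someUser" -Verbose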
-
This article describes how to create a report about group-based licensing assignments and any errors that might have occurred. The code uses the Microsoft Graph PowerShell SDK to fetch information about the groups used for licensing assignments, interpret the assignments, find users with assignment errors, and send email to inform administrators about what's been found. A rough sketch of the core idea follows below. https://practical365.com/group-based-licensing-report-email/ View the full article
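The sketch below is not the article's code; it is a hedged outline of the approach described above, assuming the Microsoft Graph PowerShell SDK. The permission scopes, the member-type filter, the error check against 'None', and the output shape are my assumptions; licenseAssignmentStates is the user property Entra uses to surface group-based licensing errors.

# Hedged sketch: find groups that assign licenses, then flag members with assignment errors.
# Not the article's actual code; cmdlet and property names follow the Graph PowerShell SDK.
Connect-MgGraph -Scopes "Group.Read.All", "User.Read.All"

# Groups that assign licenses have a populated AssignedLicenses collection
$licensingGroups = Get-MgGroup -All -Property Id, DisplayName, AssignedLicenses |
    Where-Object { $_.AssignedLicenses }

$report = foreach ($group in $licensingGroups) {
    # Group members can include devices or nested groups, so keep only user objects
    $members = Get-MgGroupMember -GroupId $group.Id -All |
        Where-Object { $_.AdditionalProperties.'@odata.type' -eq '#microsoft.graph.user' }

    foreach ($member in $members) {
        # licenseAssignmentStates lists per-SKU state and any error for the user
        $user = Get-MgUser -UserId $member.Id -Property DisplayName, LicenseAssignmentStates
        foreach ($state in $user.LicenseAssignmentStates) {
            if ($state.AssignedByGroup -eq $group.Id -and $state.Error -and $state.Error -ne 'None') {
                [pscustomobject]@{
                    Group = $group.DisplayName
                    User  = $user.DisplayName
                    SkuId = $state.SkuId
                    Error = $state.Error
                }
            }
        }
    }
}

$report | Format-Table -AutoSize

The full article also emails the results to administrators; that step is omitted from this sketch.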
-
Hi folks - Mike Hildebrand here. Welcome to spring in the US - and another daylight-savings clock-change cycle for many of us (I find it odd that we just 'change time'). Lately, I've been having conversations with customers about 'custom image' support in Windows 365. Like most aspects of IT, an image management system for standardized PC deployments can range from the simple ('Next > Next > Finish') up to the very complex (tiers, workflows and automations). Here's my walk-through of the 'dip a toe in the pool' method for trying out the custom image capabilities of Windows 365. I shared a version of this guidance with customers and colleagues, and it was suggested that I share it with the masses ... so here you go.

Step 1 - Create a VM in Azure
I keep it plain and simple; a 'disposable' VM.
Start with a 'Marketplace' W365 Cloud PC image with the M365 Apps - these have optimizations to ensure the best remoting experiences.
NOTE: I leave off monitoring agents, boot diagnostics, redundancy settings, etc.
TIP: Consider creating a 'dedicated' new Resource Group for a given image process. This makes cleaning up and reducing costs afterwards simple (which I'll cover at the end).
IMPORTANT NOTE: When making/using an initial VM as the source for the custom image, ensure "Standard" is chosen for 'Security type'. "Trusted launch virtual machine" is the default - and won't work for this process - AND it CANNOT be reverted on a deployed VM.

Step 2 - Customize it; prep it
Once the VM is created, login to it, customize it and then Sysprep it.
Apps, patches, customizations, local policy, etc.
OOBE + 'Generalize' + 'Shutdown'
NOTE: Sysprep may error out - issues such as Bitlocker being enabled, or an issue w/ one or more Store apps, can cause this. If it happens, check the log, as indicated on your VM. For the apps issue, a PS command similar to this resolves it for me, but check the specific log on your VM for the details:
Get-AppxPackage *Microsoft.Ink.Handwriting.Main.* | Remove-AppxPackage

Step 3 - Capture it
Make sure the VM is stopped (it should be), then 'Capture' the image from the portal:
The 'Subscription' you select (below) needs to be the same one as where your Windows 365 service lives.
Select 'No, capture only a managed image.'
NOTE: In my lab, the image creation process takes around 15 minutes for a simple VM.

Step 4 - Import it
Once the image is created, open the Intune portal and add it to Windows 365 via the 'Custom images' tab.
TIP: In my lab, the image import takes around 45 minutes.
NOTE: Up to 20 images can be stored here.
NOTE: the 'Subscription' you select (below) must match where you captured the image (above) or it won't show up in the 'Source image' drop-down.

Step 5 - Use it
After the image is imported into the Windows 365 service, it can be chosen from the Custom Image option in the Provisioning Policy wizard.
NOTE: If you attempt to delete a 'Custom image' that is configured in a Provisioning Policy, the deletion will fail.
NOTE: You can edit a Provisioning Policy and update/change the image, but that change will only affect new Cloud PCs provisioned from the Policy - it won't affect existing Cloud PCs spawned from that Policy.

Cleanup
The VM, disk, IP, etc. and the 'Managed image' you created/captured above will incur costs in Azure - but not the 'Custom image' you uploaded to Intune/W365 (image storage there is included as part of the service).
After you import the 'Custom image' to W365, you can/should consider deleting the Resource Group you created in Step 1 (which contains everything associated with your disposable VM – the VM itself, the disk, NIC, the image you captured, etc.). !!! HUGE warning - triple-dog-verify the Resource Group before you delete it !!! Cheers folks! Hilde View the full article
-
Hi all. What sounds like it should be simple is turning out not to be. In a domain environment, we naturally have lots of file shares. We want these shares to live in SharePoint now, not on local servers. I can copy the data using the SharePoint Migration Tool, so that bit is fine; we can also create SharePoint sites for each share and set permissions on those sites, no problem. How do we get it so that when a user logs into a domain PC, they automatically get those SharePoint document libraries mapped in This PC? View the full article
-
Hi, I have an old PC currently running Windows 11 22H2 and want to update it to 24H2. The issue is that this PC is not fully supported by Windows 11, as the CPU is unsupported and it does not have a TPM 2.0 chip. No update is offered to my computer, so I have to install Windows 11 24H2 manually on this unsupported PC. By the way, it would be great to keep the apps and personal files, so I would prefer an in-place upgrade to 24H2 rather than a clean install from a USB drive. Is there any in-place upgrade solution to update Windows 11 22H2 to 24H2? Much appreciated if you could let me know how to do that. Thank you View the full article
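Not part of the original question, but for context: one commonly cited approach is the registry value Microsoft documents for upgrading on hardware that doesn't meet the CPU/TPM requirements, followed by running setup from a mounted 24H2 ISO. Treat this as a hedged sketch only; Microsoft describes the bypass for devices with at least TPM 1.2, results on 24H2 reportedly vary, and an unsupported install is at your own risk.

# Hedged sketch (run as Administrator, back up first): allow the upgrade check to pass
# on a PC with an unsupported CPU or missing TPM 2.0. This is the registry value Microsoft
# documents for such upgrades; it may not cover every 24H2 scenario.
New-Item -Path 'HKLM:\SYSTEM\Setup\MoSetup' -Force | Out-Null
New-ItemProperty -Path 'HKLM:\SYSTEM\Setup\MoSetup' `
    -Name 'AllowUpgradesWithUnsupportedTPMOrCPU' -PropertyType DWord -Value 1 -Force | Out-Null

# Then mount the Windows 11 24H2 ISO (from the Microsoft download site) and run setup,
# choosing "Keep personal files and apps". 'E:' is a placeholder for the mounted drive letter.
& 'E:\setup.exe' /auto upgrade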
-
Whether you consider yourself a FinOps practitioner, someone who's enthusiastic about driving cloud efficiency and maximizing the value you get from the cloud, or were just asked to look at ways to reduce cost, the FinOps toolkit has something for you. This month, you'll find a complete refresh of Power BI with a new design, greatly improved performance, and the ability to calculate reservation savings for both EA and MCA; FinOps hubs have a new Data Explorer dashboard and simpler public networking architecture; and many more small updates and improvements across the board. Read on for details!

In this update:
- New to the FinOps toolkit
- Website refresh with documentation on Microsoft Learn
- Power BI report design refresh
- Calculating savings for both EA and MCA accounts
- Performance improvements for Power BI reports
- Important note for organizations that spend over $100K
- New Data Explorer dashboard for FinOps hubs
- About the FinOps hubs data model
- Simplified network architecture for public routing
- Managing exports and hubs with PowerShell
- Other new and noteworthy updates
- Thanking our community
- What's next

New to the FinOps toolkit?
In case you haven't heard, the FinOps toolkit is an open-source collection of tools and resources that help you learn, adopt, and implement FinOps in the Microsoft Cloud. The foundation of the toolkit is the Implementing FinOps guide that helps you get started with FinOps whether you're using native tools in the Azure portal, looking for ways to automate and extend those tools, or if you're looking to build your own FinOps tools and reports. To learn more about the toolkit, how to provide feedback, or how to contribute, see FinOps toolkit documentation.

Website refresh with documentation on Microsoft Learn
Before we get into each of the tool updates, I want to take a quick moment to call out an update to the FinOps toolkit website, which many of you are familiar with. Over the last few months, you may have noticed that we started moving documentation to Microsoft Learn. With that content migration final, we simplified the FinOps toolkit website to provide high-level details about each of the tools with links out to the documentation as needed. Nothing major here, but it is a small update that we hope will help you find the most relevant content faster. If you find there's anything we can do to streamline discovery of information or improve the site in general, please don't hesitate to let us know! And, as an open-source project, we're looking for people who have React development experience to help us expand this to include deployment and management experiences as well. If interested in this or any contribution, please email us at ftk-support@microsoft.com to get involved.

Power BI report design refresh
In the 0.8 release, Power BI reports saw some of the most significant updates we've had in a while. The most obvious one is the visual design refresh, which anyone who used the previous release will be able to spot immediately after opening the latest reports. The new reports align with the same design language we use in the Azure portal to bring a consistent, familiar experience. This starts on the redesigned Get started page for each report. The Get started page helps set context on what the report does and how to set it up. Select the Connect your data button for details about how to configure the report, in case you either haven't already set it up or need to make a change.
If you run into any issues, select the Get help button at the bottom-right of the page for some quick troubleshooting steps. This provides some of the same steps as you'll find in the new FinOps toolkit help + support page. Moving past the Get started page, you'll also see that each report page was updated to move the filters to the left, making a little more room for the main visuals. As part of this update, we also updated all visuals across both the storage and KQL reports to ensure they both have the latest and greatest changes. I suppose the last thing I should call out is that every page now includes a "Give feedback" link. I'd like to encourage you to submit feedback via these links to let us know what works well and what doesn't. The feedback we collect here is an important part of how we plan and prioritize work. Alternatively, you're also welcome to create and vote on issues in our GitHub repository. Each release we'll strive to address at least one of the top 10 feedback requests, so this is a great way to let us know what's most important to you!

Calculating savings for both EA and MCA accounts
If you've ever tried to quantify cost savings or calculate Effective Savings Rate (ESR), you probably know list and contracted cost are not always available in Cost Management. Now, in FinOps toolkit 0.8, you can add these missing prices in Power BI to facilitate a more accurate and complete savings estimate. Before I get into the specifics, I should note that there are 3 primary ways to connect your data to FinOps toolkit Power BI reports. You can connect reports:
- Directly to FOCUS data exported to a storage account you created.
- To a FinOps hub storage account ingestion container.
- To a FinOps hub Data Explorer cluster.

Each option provides additive benefits where FinOps hubs with Data Explorer offers the best performance, scalability, and functionality, like populating missing prices to facilitate cost savings calculations. This was available in FinOps hubs 0.7, so anyone who deployed FinOps hubs with Data Explorer need only export price sheets to take advantage of the feature. Unfortunately, storage reports didn't include the same option. That is, until the latest 0.8 release, which introduced a new Experimental: Add Missing Prices parameter. When enabled, the report combines costs and prices together to populate the missing prices and calculate more accurate savings. Please be aware that the reason this is labeled as "experimental" is because both the cost and price datasets can be large and combining them can add significant time to your data refresh times. If you're already struggling with slow refresh times, you may want to consider using FinOps hubs with Data Explorer. In general, we recommend FinOps hubs with Data Explorer for any account that monitors over $100K in total spend. (Your time is typically more valuable than the extra $125 per month.) To enable the feature, start by creating a Cost Management export for the price sheet. Then update parameters for your report to set the Experimental: Add Missing Prices parameter to true. Once enabled, you'll start to see additional savings from reservations. While this data is available in all reports, you can generally see savings on three pages within the Rate optimization report. The Summary page shows a high-level breakdown of your cost with the details that help you quantify negotiated discount and commitment discount savings.
In this release, you'll also find Effective Savings Rate (ESR), which shows your total savings compared to the list cost (what you would have paid with no discounts). The Total savings page is new in this release and shows that same cost and savings breakdown over time. And lastly, the Commitment discount savings page gives you the clearest picture of the fix for MCA accounts by showing the contracted cost and savings for each reservation instance. If savings are important for your organization, try the new Add Missing Prices option and let us know how it works for you. And again, if you experience significant delays in data refresh times, consider deploying FinOps hubs with Data Explorer. This is our at-scale solution for everyone.

Performance improvements for Power BI reports
Between gradually increased load times for storage reports and learnings from the initial release of KQL reports in 0.7, we knew it was time to optimize both sets of reports. And we think you'll be pretty excited about the updates. For those using storage reports, we introduced a new Deprecated: Perform Extra Query Optimization parameter that disables some legacy capabilities that you may not even be using:
- Support for FOCUS 1.0-preview.
- Tracking data quality issues with the x_SourceChanges column.
- Fixing x_SkuTerm values to be numbers for MCA.
- Informative x_FreeReason column to explain why a row might have no cost.
- Unique name columns to help distinguish between multiple objects with the same display name.

Most organizations aren't using these and can safely disable this option. For now, we're leaving this option enabled by default to give people time to remove dependencies. We do plan to disable this option by default in the future and remove the option altogether to simplify the report and improve performance. Cosmetic and informational transforms will be disabled by default in 0.9 and removed on or after July 1, 2025 to improve Power BI performance. If you rely on any of these changes, please let us know by creating an issue in GitHub to request that we extend this date or keep the changes indefinitely.

For those using KQL reports that use FinOps hubs with Data Explorer, you'll notice a much more significant change. Instead of summarized queries with a subset of data, KQL reports now query the full dataset using a single query. This is made possible through a Power BI feature called DirectQuery. DirectQuery generates queries at runtime to streamline the ingestion process. What may take hours to pull data in a storage report takes seconds in KQL reports. The difference is astounding. Let me state this more explicitly: If you're struggling with long refresh times or need to set up incremental refresh on your storage reports, you should strongly consider switching to FinOps hubs with Data Explorer. You'll get full fidelity against the entire dataset with less configuration.

Important note for organizations that spend over $100K
I've already stated this a few times, but for those skimming the announcement, I want to share that we've learned a lot over the past few months as organizations big and small moved from storage to KQL reports in Power BI. With a base cost of $130 per month, we are now recommending that any organization who needs to monitor more than $100,000 in spend should deploy FinOps hubs with Data Explorer. While we won't remove storage as an option for those interested in a low-cost, low-setup solution, we do recognize that Data Explorer offers the best overall value to cost.
And as we look at our roadmap, it's also important to note that Data Explorer will be critical as we expand to cover every FinOps capability. From allocation through unit economics, most capabilities require an analytical engine to break down, analyze, and even re-aggregate costs. At less than 0.2% of your total spend, we think you'll agree that the return is worth it. Most organizations see this as soon as they open a KQL report and it pulls data in seconds when they've been waiting for hours. Give it a shot and let us know what you think. We're always looking for ways to improve your experience. We think this is one of the biggest ways to improve and the great thing is it's already available!

New Data Explorer dashboard for FinOps hubs
With the addition of Data Explorer in FinOps hubs 0.7, we now have access to a new reporting tool built into Azure Data Explorer and available for free to all users! Data Explorer dashboards offer a lighter weight reporting experience that sits directly on the data layer, removing some of the complexities of Power BI reporting. Of course, Data Explorer dashboards aren't a complete replacement for Power BI. If you need to combine data from multiple sources, Power BI will still be the best option with its vast collection of connectors. This is just another option you have in your toolbelt. In fact, whether you use Power BI reports or not, we definitely recommend deploying the Data Explorer dashboard. Deploying the dashboard is easy. You import the dashboard from a file, connect it to your database, and you're ready to go! And once you set up the dashboard, you'll find pages organized in alignment with the FinOps Framework, similar to the Power BI reports. You'll find a few extra capabilities broken out in the dashboard compared to Power BI, but the functionality is generally consistent between the two, with some slight implementation differences that leverage the benefits of each platform. If you're familiar with the Power BI reports, you may notice that even this one screenshot is not directly comparable. I encourage you to explore what's available and make your own determination about which tool works best for you and your stakeholders.

Before I move on to the next topic, let me call out my favorite page in the dashboard: the Data ingestion page. Similar to Power BI, the Data ingestion page includes details about the cost of FinOps hubs, but much more interesting than that is the ingested data, which is broken down per dataset and per month. This gives you an at-a-glance view of what data you have and what you don't! This level of visibility is immensely helpful when troubleshooting data availability or even deciding when it's time to expand to cover more historical data! Whether you choose to keep or replace your existing Power BI reports, we hope you'll try the Data Explorer dashboard and let us know what you think. They're free and easy to set up. To get started, see Configure the Data Explorer dashboard.

About the FinOps hubs data model
While on the subject of Data Explorer, I'd also like to call out some new, updated, and even deprecated KQL functions available in FinOps hubs as well as how to learn more about these and other functions and tables. I'll start by calling out that FinOps hubs with Data Explorer established a model for data ingestion that prioritizes backward compatibility. This may not be evident now, with only having support for FOCUS 1.0, but you will see this as we expand to support newer FOCUS releases.
This is a lot to explain, so I won't get into it here, but instead I'll point you to where you can learn more at the end of this section. For now, let me say that you'll find two sets of functions in the Hub database: versioned and unversioned. For instance, Costs() returns all costs with the latest supported FOCUS schema (version 1.0 today), while Costs_v1_0() will always return FOCUS 1.0 data. This means that, if we were to implement FOCUS 1.1, Costs() would return FOCUS 1.1 data and Costs_v1_0() would continue to return FOCUS 1.0, whether the data was ingested with 1.0, 1.1, or even 1.0-preview, which we continue to support. I can cover this more in-depth in a separate blog post. There's a lot to versioning and I'm very proud of what we're doing here to help you balance up-to-date tooling without impacting existing reports. (This is another benefit of KQL reports over storage reports.) The key takeaway here is to always use versioned functions for tooling and reports that shouldn't change over time, and use unversioned functions for ad-hoc queries where you always want the latest schema.

Beyond these basic data access functions, we also offer 15 helper functions for common reporting needs. I won't go over them all here, but will call out a few additions, updates, and replacements. Most importantly, we identified some performance and memory issues with the parse_resourceid() function when run at scale for large accounts. We resolved the issue by extracting a separate resource_type() function for looking up resource type display names for resources. This is mostly used within internal data ingestion, but is also available for your own queries. The main callout is that, if you experienced any memory issues during data ingestion in 0.7, please look at 0.8. We're seeing some amazing performance and scale numbers with the latest update.

As you can imagine, FinOps reports use a lot of dates. And with that, date formatting is mandatory. In 0.8, we renamed the daterange() function to datestring() to better represent its capabilities and also extracted a new monthstring() function for cases when you only need the month name.
- datestring(datetime, [datetime]) returns a formatted date or date range abbreviated based on the current date (e.g., "Jan 1", "Jan-Feb 2025", "Dec 15, 2024-Jan 14, 2025").
- monthstring(datetime, [length]) returns the name of the month at a given string length (e.g., default = "January", 3 = "Jan", 1 = "J").

We also updated the numberstring() function to support decimal numbers. (You can imagine how that might be important for cost reporting!) numberstring(num, [abbrev]) returns a formatted string representation of the number based on a few simple rules that only show a maximum of three numbers and a magnitude abbreviation (e.g., 1234 = "1.23K", 12345678 = "12.3M"). And of course, these are just a few of the functions we have available. To learn more about the data model available in Power BI or Data Explorer, see FinOps hubs data model. This article will share details about managed datasets in FinOps hubs, Power BI functions used in both KQL and storage reports, Power BI tables, and KQL functions available in both Power BI and Data Explorer dashboards. If you're curious about the tables, functions, and even details about how versioning works, this will be a good reference to remember.

Simplified network architecture for public routing
In 0.7, we introduced a much-anticipated feature to enable FinOps hubs with private network routing (aka private endpoints).
As part of this update, we added all FinOps hubs components into a dedicated, isolated network for increased security. And after the release, we started to receive immediate feedback from those who prefer the original public routing option from 0.6 and before, which was not hosted within an isolated network. Based on this feedback, we updated the public routing option to exclude networking components. This update simplifies the deployment and better aligns with what most organizations are looking for when using public routing. We also published new documentation to explain both the public and private routing options in detail. If you're curious about the differences or planning to switch to one or the other, you'll want to start with Configure private networking in FinOps hubs. Configuring private networking requires some forethought, so we recommend you engage your network admins early to streamline the setup process, including peering and routing from your VPN into the isolated FinOps hubs network.

I also want to take a quick moment to thank everyone who shared their feedback about the networking changes. This was an amazing opportunity to see our tiny open-source community come together. We rallied, discussed options openly, and pivoted our direction to align with the community's preferred design. I'm looking forward to many more open discussions and decisions like this. The FinOps toolkit is for the community, by the community, and this has never been more apparent than over the last few months. Thank you all for making this community shine!

Managing exports and hubs with PowerShell
We probably don't do a good enough job raising awareness about the FinOps toolkit PowerShell module. Every time I introduce people to it, they always come back to me glowing with feedback about how much time it saved them. And with that, we made some small tweaks based on feedback we heard from FinOps toolkit users. Specifically, we updated commands for creating and reading Cost Management exports, and deleting FinOps hubs. Let's start with exports… The New-FinOpsCostExport command creates a new export. But it's not just a simple create call, like most PowerShell commands. One of the more exciting options is the -Backfill option, which allows you to backfill historical data up to 7 years with a single call! But this isn't new. In 0.8, we updated New-FinOpsCostExport to create price sheet, reservation recommendation, and reservation transaction exports. With this, we added some new options for reservation recommendations and system-assigned identities. The Get-FinOpsCostExport command retrieves all exports on the current scope based on a set of filters. While updating other commands, we updated the command to return a more comprehensive object and renamed some of the properties to be clearer about their intent. And just to call out another popular command: The Start-FinOpsCostExport command allows you to run an existing export. This is most often used when backfilling FinOps hubs but works in any scenario. This command is what's used in the New-FinOpsCostExport command. Lastly, we were asked to improve the confirmation experience for the Remove-FinOpsHub command (#1187). Now, the command shows a list of resources that will be deleted before confirming the delete. Simple, but helpful. A rough usage sketch of these commands follows below. There's a lot we can do with PowerShell. So much that it's hard to know where to go next. If you find yourself looking for anything in particular, please don't hesitate to let us know!
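As a rough illustration of how these commands fit together, here is a hedged sketch rather than the module's documented reference: only the cmdlet names and the -Backfill option come from the update notes above; the module name, the other parameter names, and the scope and resource values are my assumptions, so check Get-Help for the real signatures.

# Hedged sketch of the FinOps toolkit PowerShell module commands mentioned above.
# Cmdlet names and -Backfill come from the post; module and parameter names are assumptions.
Install-Module -Name FinOpsToolkit -Scope CurrentUser

# Create a FOCUS cost export and backfill 12 months of history in one call
# (scope and storage account values below are placeholders)
New-FinOpsCostExport -Name 'ftk-focus-costs' `
    -Scope '/providers/Microsoft.Billing/billingAccounts/123456' `
    -StorageAccountId $storageAccountId `
    -Backfill 12

# List exports configured on the scope
Get-FinOpsCostExport -Scope '/providers/Microsoft.Billing/billingAccounts/123456'

# Re-run an existing export (for example, when backfilling FinOps hubs)
Start-FinOpsCostExport -Name 'ftk-focus-costs' `
    -Scope '/providers/Microsoft.Billing/billingAccounts/123456'

# Remove a FinOps hub instance; 0.8 lists the affected resources before confirming
Remove-FinOpsHub -Name 'finops-hub' -ResourceGroupName 'finops-hub-rg'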
We're generally waiting for a signal from people like you who need not just automation scripts, but any tools in the FinOps space. So if you find something missing, create an issue to let us know how we can help!

Other new and noteworthy updates
Many small improvements and bug fixes go into each release, so covering everything in detail can be a lot to take in. But I do want to call out a few other small things that you may be interested in.

In the Implementing FinOps guide:
- Added the Learning FOCUS blog series to the FOCUS overview doc.

In FinOps hubs:
- Clean up ResourceType values that have internal resource type IDs (for example, microsoft.compute/virtualmachines).
- Updated the default setting for Data Explorer trusted external tenants from "All tenants" to "My tenant only". This change may cause breaking issues for Data Explorer clusters accessed by users from external tenants.
- Updated CommitmentDiscountUsage_transform_v1_0() to use parse_resourceid().
- Documentation updates covering required permissions and supported datasets.
- Fixed timezones for Data Factory triggers to resolve an issue where triggers would not start due to an unrecognized timezone.
- Fixed an issue where x_ResourceType was using the wrong value. This fix resolves the issue for all newly ingested data. To fix historical data, reingest data using the ingestion_ExecuteETL Data Factory pipeline.
- Added a missing request body to fix the false positive config_RunExportJobs pipeline validation errors in Data Factory.
- Deprecated the monthsago() KQL function. Please use the built-in startofmonth(datetime, [offset]) function instead.

In Power BI reports:
- Added the Pricing units open dataset to support price sheet data cleanup.
- Added PricingUnit and x_PricingBlockSize columns to the Prices table.
- Added Effective Savings Rate (ESR) to the Cost summary and Rate optimization reports.
- Expanded the columns in the commitment discount purchases page and updated it to show recurring purchases separately.
- Fixed a date handling bug that resulted in a "We cannot apply operator >= to types List and Number" error (#1180). If you run into issues, set the report locale explicitly to the locale of the desired date format.

In FinOps workbooks:
- On the Optimization workbook Commitment discounts tab, added Azure Arc Windows license management.
- On the Optimization workbook, enabled the "Export to CSV" option on the Idle backups query.
- On the Optimization workbook, corrected VM processor details in the Compute tab query.

In Azure Optimization Engine:
- Improved multi-tenancy support with Azure Lighthouse guidance.

In open data:
- Added 4 new region mappings to existing regions.
- Added the "1000 TB" pricing unit.
- Added 45 new and updated 52 existing resource types.
- Added 4 new resource type to service mappings.

Thanking our community
As we approach the two-year anniversary of our first public release, I have to look back and acknowledge how far we've come. We all want to do more and move faster, which makes it easy to get lost in the day-to-day work our community does and lose sight of the progress we're making. There are honestly too many people to thank, so I won't go into listing everyone, but I do want to send out an extra special thank you to the non-Microsoft contributors who are making this community and its tools better. I'll start the list off strong with Roland Krummenacher, a consultant who specializes in Azure optimization. He and his team built a tool similar to FinOps hubs and, after seeing 0.7 ship with Data Explorer, rearchitected their tool to extend FinOps hubs.
Roland's team helps clients optimize their environment and build custom extensions to FinOps hubs that drive value realization. We're collaborating regularly to build a plan for bringing some of their extensions into the toolkit. Several 0.8 improvements were made because of our collaboration with Roland.

Next up is Graham Murphy, a FinOps professional who's been using FinOps hubs since the early days. Graham has always been amazingly collaborative. He extended FinOps hubs to bring in GCP and AWS FOCUS data and often shares his experiences with the FinOps community on Slack. Graham is also part of the FOCUS project, which has proven useful as well.

Speaking of FOCUS, Brian Wyka is an engineer who provided some feedback on our FOCUS documentation. What impressed me most is that Brian didn't just give us feedback; he also engaged deeply in the pull request we opened to address it. It was amazing to see him stick with the topic through to the end.

Similar to Graham, John Lundell is a FinOps practitioner who extended FinOps hubs and is sharing his experiences with the community. John took the time to document his approach for using FinOps hubs to get data into Microsoft Fabric. For those interested, check out Sharing how we are enhancing the toolkit for bill-back purposes.

Eladio Rincón Herrera has been with us for over a year now. The thing that really stands out to me about Eladio is the depth of context he provides. This has helped immensely a few times as we've narrowed down issues that not only he, but others, were facing. Eladio's engagement in our discussion forums has helped many others both directly and indirectly. It's always a pleasure to work with Eladio!

Psilantropy has also been with us for over a year. They have been quite prolific over that time as well, sharing ideas, issues, and supporting discussions across four separate tools! Their reports are always extremely detailed and immensely helpful in pinpointing the underlying problem or fully understanding the desired feature request.

And now for someone who holds a special place in my heart: Patrick K. Patrick is an architect who leveraged FinOps hubs within his organization and needed to add private endpoints. He took the time to submit a pull request to contribute those changes back to the product. This was our first major external pull request, which is what made it so special. It spun up many discussions and debates on the approach that took time to work through, but I always look back to Patrick as the one who really kickstarted the effort with that first pull request!

Of course, this isn't everyone. I had to trim the list of people down a few times to really focus on a select few. (I'm sure I'll feel guilty about skipping someone later!) And that doesn't even count all the Microsoft employees who make the FinOps toolkit successful – both in contributions and through supporting the community. I'm truly humbled when I see how this community has grown and continues to thrive! Thank you all!

What's next

As we rounded out 2024, I have to say I was quite proud of what we were able to achieve. Coming into 2025, I was expecting a lightweight initial release, but we ended up doing much more than we expected, which is great. We saw some amazing (and unexpected) improvements in this release. And while I'd love to say we're going to focus on small updates, I have to admit we have some lofty goals.
Here are a few of the things we're looking at in the coming months:

FinOps hubs will add support for ingesting data into Microsoft Fabric eventhouses and introduce recommendations, similar to what you see in Azure Optimization Engine and FinOps workbooks.
Power BI reports will add support for Microsoft Fabric lakehouses.
FinOps hubs and Power BI will both get updated to the latest FOCUS release.
FinOps workbooks will continue to get recurring updates, expand to more FinOps capabilities, and add cost from FinOps hubs.
Azure Optimization Engine will continue to receive small updates as we begin to bring some capabilities into FinOps hubs in upcoming releases.

Each release, we'll try to pick at least one of the highest voted issues (based on 👍 votes) to continue to evolve based on your feedback, so keep the feedback coming! To learn more, check out the FinOps toolkit roadmap, and please let us know if there's anything you'd like to see in a future release. Whether you're using native products, automating and extending those products, or using custom solutions, we're here to help make FinOps easier to adopt and implement. View the full article
-
Outlook Newsletters are intended for internal communications, at least during the preview. It's possible to take the HTML for a newsletter and send it with Azure Email Communication Services (ECS), the pay-as-you-go (PAYG) service for bulk email. That sounds like a good way to use Outlook Newsletters to share information with customers and other external recipients. With some manual intervention, everything works, but it would be great if Microsoft tweaked Outlook to remove the rough edges. https://office365itpros.com/2025/03/12/outlook-newsletters-ecs/ View the full article
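For reference, here is a rough sketch (not from the linked article) of what pushing exported newsletter HTML through the ACS Email send REST operation can look like from PowerShell. The endpoint, api-version, payload shape, and role requirements follow the ACS Email documentation as I understand it, and the resource name and addresses are placeholders, so verify against the current API reference before relying on it.

```powershell
# Assumptions: an Azure Communication Services resource with the Email service configured,
# a verified sender address, and a signed-in Azure identity allowed to send email.
$endpoint   = 'https://contoso-acs.communication.azure.com'   # placeholder resource
$newsletter = Get-Content -Path .\newsletter.html -Raw        # HTML exported from Outlook

# Acquire a Microsoft Entra token for the Communication Services resource.
$token = (Get-AzAccessToken -ResourceUrl 'https://communication.azure.com').Token

$body = @{
    senderAddress = 'DoNotReply@contoso.com'                  # placeholder verified sender
    recipients    = @{ to = @(@{ address = 'customer@example.com' }) }
    content       = @{
        subject = 'March newsletter'
        html    = $newsletter
    }
} | ConvertTo-Json -Depth 5

# Send the message; api-version may differ in your environment.
Invoke-RestMethod -Method Post `
    -Uri "$endpoint/emails:send?api-version=2023-03-31" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' `
    -Body $body
```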
-
Who has more details? View the full article
-
Hey everyone... I'm not a big techy person, and I'm at the point of frustration where I have no idea what's causing these BSODs anymore. I've tried numerous things from YouTube/Google to solve the issue, but they keep happening, and I would really appreciate some help. I will randomly blue screen, or my PC will restart, sometimes when I'm gaming or even just watching a YouTube video. At first I thought it was just my RAM, so I went out and spent $150 on new RAM, but the BSODs kept coming even after the new sticks went in. If I'm lucky I can go a few hours without a blue screen, but sometimes it will happen every 5 minutes or so until it gives up and lets me play for a while. I'm not sure if there's a setting that's wrong on my PC, but I have tried many things, including checking for outdated drivers, and to my knowledge everything is updated. I have even tried resetting my PC and keeping personal files. View the full article
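Not part of the original post, but a common first diagnostic step in cases like this is to pull the bugcheck history from the System event log and check whether minidumps are being written; something along these lines from PowerShell:

```powershell
# Recent bugcheck (BSOD) records, including the stop code, from the System log.
Get-WinEvent -FilterHashtable @{
    LogName      = 'System'
    ProviderName = 'Microsoft-Windows-WER-SystemErrorReporting'
    Id           = 1001   # "The computer has rebooted from a bugcheck"
} -MaxEvents 10 | Format-List TimeCreated, Message

# Unexpected shutdowns/restarts (Kernel-Power event 41).
Get-WinEvent -FilterHashtable @{
    LogName      = 'System'
    ProviderName = 'Microsoft-Windows-Kernel-Power'
    Id           = 41
} -MaxEvents 10 | Format-Table TimeCreated, Id, LevelDisplayName

# Minidump files, if any; these can be opened in WinDbg to identify the faulting driver.
Get-ChildItem C:\Windows\Minidump -ErrorAction SilentlyContinue |
    Sort-Object LastWriteTime -Descending |
    Select-Object Name, LastWriteTime, Length
```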
-
How do you connect two PCs using USB cables? I've been researching how to transfer files between PCs, and it looks like the easiest option would be a USB data transfer cable. I found these instructions online: Transferring Files Using a USB-to-USB Data Transfer Cable. View the full article
-
I tried installing via a Rufus USB installer two or three times in a row. Each time, after the reboot, it got stuck on "Loading operating system...". Then I thought about installing Windows 10 and updating to Windows 11; however, during the process it BSODed and the changes were reverted. This happened twice. Next, I tried the Rufus USB installer again, and it BSODed with different errors on the second reboot. Then I tried to boot from a Windows 11 To Go USB drive, which resulted in watchdog BSODs. I was losing hope. Then I had the idea to use an old HP ProBook 6540b that had a similar CPU (Intel Core i5 520M). I installed Windows 11 on it and then plugged the SSD into my main PC. It initially failed, but then the recovery menu appeared and I booted into safe mode. It took a while, and after a restart the magic happened: it successfully booted into Windows 11. The only issue was that the Ethernet card wasn't working. To fix it, I went to Device Manager (during the Wi-Fi connection prompt at sign-in) and tried to automatically install the driver for the Ethernet card, which resulted in another BSOD. (Later, I fixed it by downloading the driver from the Realtek website.) So, that's my story. I hope you could follow it. View the full article
-
After installing Windows, always 2 keyboard layouts
When I do a clean installation of Windows 11 (and also of Windows 10) and create a user, that user gets 2 keyboard layouts, for example Belgium and USA, or Belgium and Netherlands. How can I make it so that when I create the first user, only the Belgian keyboard layout is added and no other? View the full article
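There's no reply in the thread, but one generic workaround (an assumption on my part, not from the post) is to trim the layout list for the signed-in user with the built-in International PowerShell cmdlets; for a truly clean first-user experience, the usual route is an unattend.xml answer file that sets InputLocale, which is beyond this sketch.

```powershell
# Show the languages and keyboard layouts assigned to the current user.
Get-WinUserLanguageList

# Rebuild the list with only Dutch (Belgium) and apply it, dropping extra layouts
# such as US or Dutch (Netherlands). 'nl-BE' is the Dutch (Belgium) language tag;
# French-speaking users would use 'fr-BE' instead.
$list = New-WinUserLanguageList -Language 'nl-BE'
Set-WinUserLanguageList -LanguageList $list -Force
```
-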
Right-clicking a desktop shortcut and choosing Open file location does not work, except for the Excel and Word shortcuts. Has anyone else experienced this? If I right-click ANY shortcut icon on the desktop in Windows 10 on the other laptop next to me and select Open file location, it will open the location for ANY shortcut or program. I have the same programs on the Windows 11 laptop, plus BAT and other files. When I right-click on the desktop in Windows 11 23H2 (I have not updated in a month or so due to problems I read about with updates) and select "Open file location", nothing happens, UNLESS it is the Word or Excel shortcut. Right-click > Troubleshoot compatibility also does not work, but I never tried it from the desktop before, so I do not care about that. I have normal things on the desktop like Zoom, Macrium, FileZilla Pro, Chrome, Firefox, and various shortcuts to folders. It works within the Start menu, or if I select just Open in the Start menu. But on the desktop, nothing, except Microsoft Word and Excel. The same goes for Open folder location. Not a big deal, but just wondering. Thank you very much! View the full article
-
I have an HP all-in-one PC that is having trouble exiting sleep mode when the mouse or keyboard is touched. I have checked the power management settings in Device Manager, and they all look correct; they should allow the peripheral devices to wake the computer from sleep. But they don't wake it, so I must press the POWER button on the computer. The POWER button shuts it down (manually, I guess you could say) and restarts the computer. I don't know if something is crashing when I press the ENTER key or touch the mouse, or what. Thanks in advance for the help! View the full article
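Not from the original post, but a generic check that often helps with wake problems is to confirm which devices are actually armed to wake the machine and what woke it last. From an elevated PowerShell (or Command Prompt) window:

```powershell
# Devices currently allowed to wake the PC from sleep.
powercfg /devicequery wake_armed

# What woke the machine the last time it resumed.
powercfg /lastwake

# Build a sleep diagnostics report (prints the path to an HTML file to review).
powercfg /systemsleepdiagnostics
```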
-
I think it is pretty cool, and it is now on DeviantArt. View the full article
-
I'd like just the desktop and icons, as in many previous Windows versions. The updates that changed this seem to have happened recently. I'm currently on 24H2, build 26100.3476, Windows Feature Experience Pack 1000.26100.54.0. I don't generally use the Edge browser that it seems to launch, asking for either a URL or a search. I haven't gone to the effort required to make Brave my default, but it is what I use most of the day. I have a couple of work-related sites that I use Edge for, but something recently seems to have created the issue I see. Startup gives me the desktop that I want, but after maybe 15 or so seconds it launches what I believe is a browser of sorts. View the full article
-
Hi guys, I recently wanted to find a YouTube to MP3 converter, mainly to download some copyright-free background music for offline listening on my computer and phone. However, many free online YouTube to MP3 converter websites are full of advertisements, and some tools are flagged as risky after downloading, so I am not very confident using them. So I would like to ask everyone: is there a good, safe YouTube to MP3 converter to recommend in 2025? I am mainly concerned about the following points:

Safety: not loaded with ads or viruses; ideally an ad-free tool.
Sound quality: supports 320kbps high-quality downloads without losing too much quality in conversion.
Ease of use: either a web version or desktop software is fine, but ideally free or a one-time purchase rather than a subscription.
Batch downloads: it would be even better if it can convert the audio of multiple videos at once.

I have used some online conversion websites before, but I found that they are either full of ads or have speed limits when downloading, and some of them can no longer be opened. If you are using a conversion tool, I hope you can share your experience. Thank you! View the full article
-
Take the time you need to relax or handle personal matters without worry. A well-crafted out-of-office message can ensure everything runs smoothly in your absence. And you don't have to do it alone! Microsoft Copilot is here to help you create the perfect out-of-office message for Outlook, Teams, or any other communication app quickly and easily. #Copilot #Microsoft365 #Productivity #MPVbuzz #CopilotForM365 #M365 View the full article