
Exploring a minimal WebAPI with ASP.NET Core


They are still working on the "dotnet new" templates, but you can also get cool templates from "yo aspnet" using Yeoman. The generator-aspnet package for Yeoman includes an empty web app, a console app, a few web app flavors, test projects, and a very simple Web API application that returns JSON and generally tries to be RESTful.

yo aspnet

The Startup.cs is pretty typical and basic. The Startup constructor sets up the Configuration with an appsettings.json file and adds a basic Console logger. Then, by calling "UseMvc()", we get all of ASP.NET Core MVC, which includes both centralized routing and attribute routing. ASP.NET Core's controllers are unified now, so there aren't separate "Controller" and "ApiController" base classes. It's just Controller. Controllers that return JSON and those that return Views with HTML are the same, so they get to share routes and lots of functionality.

public class Startup
{
    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
            .AddEnvironmentVariables();
        Configuration = builder.Build();
    }

    public IConfigurationRoot Configuration { get; }

    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
        // Add framework services.
        services.AddMvc();
    }

    // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
    {
        loggerFactory.AddConsole(Configuration.GetSection("Logging"));
        loggerFactory.AddDebug();

        app.UseMvc();
    }
}

Then you can make a basic controller and use Attribute Routing to do whatever makes you happy. Just putting [HttpGet] on a method makes that method the default for a simple GET to /api/Values.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

namespace tinywebapi.Controllers
{
    [Route("api/[controller]")]
    public class ValuesController : Controller
    {
        // GET api/values
        [HttpGet]
        public IEnumerable<string> Get()
        {
            return new string[] { "value1", "value2" };
        }

        // GET api/values/5
        [HttpGet("{id}")]
        public string Get(int id)
        {
            return "value";
        }

        // POST api/values
        [HttpPost]
        public void Post([FromBody]string value)
        {
        }

        // PUT api/values/5
        [HttpPut("{id}")]
        public void Put(int id, [FromBody]string value)
        {
        }

        // DELETE api/values/5
        [HttpDelete("{id}")]
        public void Delete(int id)
        {
        }
    }
}

If we run this with "dotnet run" and call/curl/whatever to http://localhost:5000/api/Values we'd get a JSON array of two values by default. How would we (gasp!) add XML as a formatting/serialization option that would respond to a request with an Accept: application/xml header set?

I'll add "Microsoft.AspNetCore.Mvc.Formatters.Xml" to project.json and then add one method to ConfigureServices():

services.AddMvc()
        .AddXmlSerializerFormatters();

Now when I go into Postman (or curl, etc) and do a GET with Accept: application/xml as a header, I'll get the same object expressed as XML. If I ask for JSON, I'll get JSON.

 Postman is a great way to explore WebAPIs
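
If you'd rather verify the content negotiation from code instead of Postman, a minimal console sketch with HttpClient works too. This is only an illustration; the URL assumes the default Kestrel port from the earlier "dotnet run":

using System;
using System.Net.Http;
using System.Net.Http.Headers;

class ContentNegotiationDemo
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            // Ask for XML; send application/json instead and the same endpoint returns JSON.
            client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/xml"));
            var body = client.GetStringAsync("http://localhost:5000/api/Values").GetAwaiter().GetResult();
            Console.WriteLine(body);
        }
    }
}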

If I like, I can create my own custom formatters and return whatever makes me happy. PDFs, vCards, even images.
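
To make that concrete, here's a rough sketch of a custom formatter. The Contact model, the VcardOutputFormatter name, and the text/vcard media type are all made up for illustration; the general shape is deriving from TextOutputFormatter and registering it in ConfigureServices:

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc.Formatters;
using Microsoft.Net.Http.Headers;

// Hypothetical model used only for this example.
public class Contact
{
    public string Name { get; set; }
}

// Writes a Contact as a bare-bones vCard when the client sends Accept: text/vcard.
public class VcardOutputFormatter : TextOutputFormatter
{
    public VcardOutputFormatter()
    {
        SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse("text/vcard"));
        SupportedEncodings.Add(Encoding.UTF8);
    }

    protected override bool CanWriteType(Type type)
    {
        return type == typeof(Contact);
    }

    public override Task WriteResponseBodyAsync(OutputFormatterWriteContext context, Encoding selectedEncoding)
    {
        var contact = (Contact)context.Object;
        var vcard = $"BEGIN:VCARD\r\nVERSION:3.0\r\nFN:{contact.Name}\r\nEND:VCARD\r\n";
        return context.HttpContext.Response.WriteAsync(vcard, selectedEncoding);
    }
}

// Registered alongside the XML formatters in ConfigureServices:
// services.AddMvc(options => options.OutputFormatters.Add(new VcardOutputFormatter()))
//         .AddXmlSerializerFormatters();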

Next post I'm going to explore the open source NancyFx framework and how to make minimal WebAPI using Nancy under .NET Core.


Sponsor: Thanks to Aspose for sponsoring the feed this week! Aspose makes programming APIs for working with files, like: DOC, XLS, PPT, PDF and countless more.  Developers can use their products to create, convert, modify, or manage files in almost any way. Aspose is a good company and they offer solid products. Check them out, and download a free evaluation!




Exploring a minimal WebAPI with .NET Core and NancyFX


WOCinTechChat photo used under CC

In my last blog post I was exploring a minimal WebAPI with ASP.NET Core. In this one I wanted to look at how NancyFX does it. Nancy is an open source framework that takes some inspiration from Ruby's "Sinatra" framework (get it? Nancy Sinatra) and it's a great alternative to ASP.NET. It is an opinionated framework - and that's a good thing. Nancy promotes what they call the "super-duper-happy-path." That means things should just work, they should be easy to customize, your code should be simple, and Nancy itself shouldn't get in your way.

As I said, Nancy is open source and hosted on GitHub, so the code is here https://github.com/NancyFx/Nancy. They're working on a .NET Core version right now that is Nancy 2.0, but Nancy 1.x has worked great - and continues to - on .NET 4.6 on Windows. It's important to note that Nancy 1.4.3 is NOT beta and it IS in production.

As of a few weeks ago there was a Beta of Nancy 2.0 on NuGet so I figured I'd do a little Hello Worlding and a Web API with Nancy on .NET Core. You should explore their samples in depth as they are not just more likely to be correct than my blog, they are just MORE.

I wanted to host Nancy with the ASP.NET Core "Kestrel" web server. The project.json is simple, asking for just Kestrel, Nancy, and the Owin adapter for ASP.NET Core.

{
  "version": "1.0.0-*",
  "buildOptions": {
    "debugType": "portable",
    "emitEntryPoint": true
  },
  "dependencies": {
    "Microsoft.NETCore.App": {
      "version": "1.0.0",
      "type": "platform"
    },
    "Microsoft.AspNetCore.Server.Kestrel": "1.0.0",
    "Microsoft.AspNetCore.Owin": "1.0.0",
    "Nancy": "2.0.0-barneyrubble"
  },
  "commands": {
    "web": "Microsoft.AspNet.Server.Kestrel"
  },
  "frameworks": {
    "netcoreapp1.0": {}
  }
}

And the Main is standard ASP.NET Core preparation, setting up the WebHost and running it with Kestrel:

using System.IO;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;

namespace NancyApplication
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var host = new WebHostBuilder()
                .UseContentRoot(Directory.GetCurrentDirectory())
                .UseKestrel()
                .UseStartup<Startup>()
                .Build();

            host.Run();
        }
    }
}

Startup tells ASP.NET Core, via the Owin adapter, that Nancy is in charge. (And sure, it didn't need to be its own file; the same wiring could have been done inline in Main with a lambda, as shown after the Startup class below.)

using Microsoft.AspNetCore.Builder;
using Nancy.Owin;

namespace NancyApplication
{
    public class Startup
    {
        public void Configure(IApplicationBuilder app)
        {
            app.UseOwin(x => x.UseNancy());
        }
    }
}
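
For completeness, here's roughly what that inline alternative looks like. This is a sketch using the Configure extension on WebHostBuilder instead of a Startup class, with the same names as the Program above:

using System.IO;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Nancy.Owin;

namespace NancyApplication
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var host = new WebHostBuilder()
                .UseContentRoot(Directory.GetCurrentDirectory())
                .UseKestrel()
                // No Startup class needed; wire Nancy up inline.
                .Configure(app => app.UseOwin(x => x.UseNancy()))
                .Build();

            host.Run();
        }
    }
}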

Here's where the fun stuff happens. Check out a simple Nancy Module.

using Nancy;

namespace NancyApplication
{
    public class HomeModule : NancyModule
    {
        public HomeModule()
        {
            Get("/", args => "Hello World, it's Nancy on .NET Core");
        }
    }
}

Then it's just "dotnet restore" and "dotnet run" and I'm in business. But let's do a little more. This little bit was largely stolen from Nancy's excellent samples repository. Here we've got another route that will respond to a GET to /test/Scott, then make and return a new Person() object. Since I'm going to pass in the Accept: application/json header I'll get JSON back.

using Nancy;
 
namespace NancyApplication
{
    public class HomeModule : NancyModule
    {
        public HomeModule()
        {
            Get("/", args => "Hello from Nancy running on CoreCLR");
            Get("/test/{name}", args => new Person() { Name = args.name });
        }
    }
    public class Person
    {
        public string Name { get; set; }
    }
}
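
If you want to accept input as well, Nancy's model binding is nearby. Here's a hedged sketch that builds on the Person class above; the Bind<T>() extension comes from Nancy.ModelBinding, and you should check the Nancy 2.0 samples for the exact shape in the beta:

using Nancy;
using Nancy.ModelBinding;

namespace NancyApplication
{
    public class PeopleModule : NancyModule
    {
        public PeopleModule()
        {
            // POST /people with a JSON body like {"Name":"Scott"} binds to a Person
            // and echoes it back, content-negotiated just like the GET above.
            Post("/people", args =>
            {
                var person = this.Bind<Person>();
                return Negotiate
                    .WithModel(person)
                    .WithStatusCode(HttpStatusCode.Created);
            });
        }
    }
}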

I'm using the Postman application to test this little Web API and you can see the JSON response below:

Postman shows a JSON object coming back from a GET request to a Web API

Nancy is a very complete and sophisticated framework with a LOT of great code to explore. It's extremely modular and flexible. It works with ASP.NET, with WCF, on Azure, with Owin, alongside Umbraco, with Mono, and so much more. I'm looking forward to exploring their .NET Core version as it continues development.

Finally, if you're a new open source contributor or are considering being a First Timer and helping out an open source project, you might find the idea of getting involved with such a sophisticated project intimidating. Nancy participates in UpForGrabs and has some issues that are marked as "up for grabs" that could be a good starting point where you could help out a deserving project AND get involved in open source.

* WOCinTechChat photo used under CC


Sponsor: Thanks to Aspose for sponsoring the feed this week! Aspose makes programming APIs for working with files, like: DOC, XLS, PPT, PDF and countless more.  Developers can use their products to create, convert, modify, or manage files in almost any way. Aspose is a good company and they offer solid products. Check them out, and download a free evaluation!



GitKraken Pro git client teaming up with NightScout to support Open Source Diabetes software

GitKraken + NightScout

There's a great new cross-platform Git client available now called GitKraken. It's totally free to download, but the optional Pro version includes some great features like a Merge Conflict editor, profile support to keep work and personal separate, and more. It's $6 a month, or just $5 a month if you go annually.

We've talked to the folks at GitKraken and we've brokered an amazing deal. My friend Hamid Shojaee who owns Axosoft has always supported open source and is a generous charitable giver. He's always asking about my diabetes and the software I use to manage it. As you may know, I use the open source Nightscout application running in Azure to visualize my blood sugar in near-real-time. My wife and family can see my numbers and support me remotely. If you are diabetic or have anyone with diabetes in your life, you'll quickly find that NightScout is indispensable. For parents of children with diabetes it's truly life-changing.

Since GitKraken is great for Git and working with software hosted on GitHub, and Nightscout is all open source and hosted on GitHub, partnering up seemed very natural. GitKraken is going to give all the revenue for the first month's sales to the Nightscout Foundation!

As well as providing awesome additional features, upgrading to GitKraken Pro is an opportunity to help raise money and increase awareness for the Nightscout Foundation, a nonprofit that is improving the lives of people and families affected by type 1 diabetes, through their support and creation of open source diabetes management systems.

So, until August 28, 2016, 100% of first month revenues from the sales of GitKraken Pro will be donated to the Nightscout Foundation. There is no upper limit to how much money Axosoft will donate for the month, but we’ll be making a minimum commitment of $5,000!

GitKraken is written in Electron and is totally cross-platform. Its tree view is a standout feature in my opinion and makes it a lot easier to visualize complex repositories and branches.

GitKraken

Again, it's free but you can optionally upgrade to Pro if you like it and want some more advanced features. And, for the next month, if you do upgrade GitKraken to Pro you'll be helping the Nightscout Foundation support the development of open source software to fight diabetes!


Sponsor: I want to thank Stackify for sponsoring the blog this week, and what's more for gifting the world with Prefix. It's a must have .NET profiler for your dev toolbox. Do yourselves a favor and download it now—free!




Collecting Windows 10 "Anniversary Edition" Keyboard Shortcuts


The new Windows 10 Calendar widget is lovely

I'm a big fan of keyboard shortcuts.

There's a fantastic list of Windows 10 shortcuts *inside* the Windows 10 Insiders "Feedback Hub" app. The in-app direct link (not a web link) is here but I think the list is too useful not to share so I don't think they will mind if I replicate the content here on the web.

There is also a nice support page that includes a near-complete list of Keyboard Shortcuts for Windows 7, 8.1 and 10.

"We asked our engineers on the team to share some of their favorite (and lesser-known) keyboard shortcuts for Windows 10. Here is the list!"

Note: [NEW] denotes a new keyboard shortcut introduced in the Windows 10 Anniversary Update.

Quick access to basic system functions:

  • Ctrl + Shift + Esc: Opens Task Manager.
  • WIN + F: Opens the Feedback Hub with a screenshot attached to your feedback. 
  • WIN + I: Opens the Settings app. 
  • WIN + L: Will lock your PC. 
  • WIN + X: Opens a context menu of useful advanced features.
  • WIN + X and A: Opens Command Prompt with administrative rights. 
  • WIN + X and P: Opens Control Panel. 
  • WIN + X and M: Opens Device Manager.
  • WIN + X and U then S: Puts your PC to sleep. 
    • Scott: Or just push the power button on most laptops or close the lid
  • WIN + Down: Minimizes an app. 
  • WIN + Up: Maximizes an app. 

Capturing what’s on your screen:

  • Alt + PrtScrn: Takes a screenshot of open window and copies to your clipboard. 
  • WIN + PrtScrn: Takes a screenshot of your entire desktop and saves it into a Screenshots folder under Photos in your user profile. 
  • WIN + Alt + R: Start/stop recording your apps & games. 

Mastering File Explorer:

  • Alt + D in File Explorer or browser: Puts you in the address bar. 
  • F2 on a file: Renames the file. 
  • Shift + Right-click in File Explorer: Will give you option to launch Command Prompt with whatever folder you are in as the starting path. 
  • Shift + Right-click on a file: “Copy as path” is added to the context menu.
    • Scott: These two are gold. Copy as path has been around for years and is super useful.

For the taskbar:

  • WIN + <number>: Opens whatever icon (app) is in that position on the taskbar. 
  • [NEW] WIN + Alt + D: Opens date and time flyout on the taskbar.  
    • Scott: I love the new calendar stuff in Windows 10. You just click the clock in the corner and you get not only clock and calendar but also your agenda for the day from your calendar. I think Windows 10 should include more stuff like this going forward - integrating your mail, calendar, plan, trips, commutes, directly in the OS and not just in Apps. That's one of the reasons I like Live Tiles. I like to see information without launching formal apps.  I like widgets on iOS and Android.
  • WIN + S: Search for apps and files. Just type the app name (partially) or executable name (if you know it) and press Enter. Or Ctrl + Shift+ Enter if you need this elevated.
  • WIN + Shift + <number>: Opens a new window of whatever icon (app) is in that position on the taskbar (as will Shift + Click on the icon). 
  • WIN + Shift + Ctrl + <number>: Opens a new window of whatever icon (app) is in that position on the taskbar with administrative rights. 

Remote Desktop and Virtual Desktop:

  • CTRL + ALT + Left Arrow: VM change keyboard focus back to host.   
  • CTRL + ALT + HOME: Remote Desktop change keyboard focus back to host.

For example, in a VM, CTRL + ALT + Left Arrow then ALT + TAB lets you get focus back and switch to an app on your dev machine besides the VM.

Cortana:

  • [NEW] WIN + Shift + C: Opens Cortana to listen to an inquiry. 

Other neat keyboard shortcuts:

  • Alt + X in WordPad: Used on a selected character or word in WordPad, it will show/hide the Unicode value.
  • Alt + Y on a UAC prompt: Automatically chooses yes and dismisses the prompt. 
  • Ctrl + mouse scroll-wheel: Scrolling will zoom and un-zoom many things across the OS. Middle clicking on the mouse scroll-wheel will dismiss tabs, open windows in taskbar, and notifications from the Action Center (new). 
  • Shift + F10: Will open the context menu for whatever is in focus. 

Here are some useful keyboard shortcuts on Surface devices: 

  • Fn + Left arrow: Home
  • Fn + Right arrow: End
  • Fn + Up arrow: Page Up
  • Fn + Down arrow: Page Down
  • Fn + Del: Increases screen brightness.
  • Fn + Backspace: Decreases screen brightness.
  • Fn + Spacebar: Takes a screenshot of the entire screen or screens and puts it into your clipboard. 
  • Fn + Alt + Spacebar: Takes a screenshot of an active window and puts it into your clipboard.

What are YOUR favorite keyboard shortcuts for Windows?


Sponsor: I want to thank Stackify for sponsoring the blog this week, and what's more for gifting the world with Prefix. It's a must have .NET profiler for your dev toolbox. Do yourselves a favor and download it now—free!



Two tools for quick and easy web application load testing during development


I was on the ASP.NET Community Standup this morning and Jon mentioned a new tool for load testing called "Netling." This got me to thinking about simple lightweight load testing in general. I've used large enterprise systems like SilkTest  as well as the cloud based load testing tools like those in Azure and Visual Studio. I've also used command-line tools like WCAT, an old but very competent load testing tool.

I thought I'd take a moment and look at two tools run locally. The goal is to see how easily I can do quick load tests and iterate on the results.

Netling

Netling is by Tore Lervik and is a nice little load tester client for easy and quick web testing. It's open source and on GitHub which is always nice. It's fun to read other people's code.

Netling includes both a WPF and Console client and is cleanly factored with a Core project that does all the work. With the WPF version you do a test and then optionally mark that test as a baseline. Then you can make small changes as you like and do a quick retest. You'll get red (bad) or green (good) results showing whether things got worse or better. This should probably be adjusted to ensure it is visible for those with red-green color blindness. Regardless, it's a nice clean UI and definitely something you'll want to throw into your utilities folder and call upon often!

Do remember that it's not really nice to do load testing on web servers that you don't own, so be kind.

Note that for now there are no formal "releases" so you'll need to clone the repo and build the app. Fortunately it builds very cleanly with the free version of Visual Studio Community 2015.

Netling is a nice load tester for Windows

The Netling console client is also notable for its cool ASCII charts.

D:\github\Netling\Netling.ConsoleClient\bin\x64\Debug [master ≡]> .\netling.exe http://www.microsoft.com -t 8 -d 20


Running 20s test @ http://www.microsoft.com/
Threads: 8
Pipelining: 1
Thread afinity: OFF

1544 requests in 20.1s
Requests/sec: 77
Bandwidth: 3 mbit
Errors: 0
Latency
Median: 99.876 ms
StdDev: 10.283 ms
Min: 84.998 ms
Max: 330.254 ms





██
███
████████████████████ █ █ █
84.998 ms =========================================================== 330.254 ms

D:\github\Netling\Netling.ConsoleClient\bin\x64\Debug [master ≡]>

I'm sure that Tore would appreciate the help so head over to https://github.com/hallatore/Netling and file some issues but more importantly, perhaps chat with him and offer a pull request?

WebSurge

WebSurge is a more fully featured tool created by Rick Strahl. Rick is known in .NET spaces for his excellent blog. WebSurge is a quick free download for personal use but you should register it and talk to Rick if you plan on using it commercially or a lot as an individual.

WebSurge also speaks the language of the Fiddler Web Debugging Proxy so you can record and play back web traffic and generate somewhat sophisticated load testing scenarios. The session files are just text files that you can put in source control and share with other members of your team.


I realize there's a LOT of choices out there. These are just two really quick and easy tools that you can use as a developer to easily create HTTP requests and then play them back at will and iterate during the development process.
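
If you want an even quicker-and-dirtier smoke test from code before reaching for a real tool, even a tiny HttpClient loop will do. This is only a sketch; the URL and request count are placeholders, and as noted above, only point it at servers you own:

using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class TinyLoadTest
{
    static void Main()
    {
        const string url = "http://localhost:5000/"; // placeholder - point this at your own site
        const int requests = 200;

        using (var client = new HttpClient())
        {
            var sw = Stopwatch.StartNew();
            // Fire a batch of GETs concurrently and wait for all of them.
            var tasks = Enumerable.Range(0, requests)
                                  .Select(_ => client.GetAsync(url))
                                  .ToArray();
            Task.WaitAll(tasks);
            sw.Stop();

            var ok = tasks.Count(t => t.Result.IsSuccessStatusCode);
            Console.WriteLine($"{ok}/{requests} succeeded in {sw.ElapsedMilliseconds} ms " +
                              $"({requests * 1000.0 / sw.ElapsedMilliseconds:F0} req/sec)");
        }
    }
}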

What do YOU use for load testing and iterating on performance during development? Let us all know in the comments.


Sponsor: Big thanks to Redgate for sponsoring the feed this week. Could you deploy 1,000 databases? Imagine working in a 70-strong IT team, with 91 applications and 1,000+ databases. Now imagine deployment time. It’s not fiction, it’s fact. Read FlexiGroup's story.




Visual Studio 2015 - Fixing "Dependencies npm not installed" from fsevents with node on Windows


Maria and I were doing some work together on Thursday and I did a clone of one of her ASP.NET Core repositories and opened it in Visual Studio 2015 Community Edition on my local machine. Some of the JavaScript tool libraries didn't load so I went back into Solution Explorer to see if there was a problem and I saw this weird error that said (or at least, I read it as) "npm not installed."


That's weird, since I have npm installed. I dropped out to the command line and ran:

C:\Users\scott>npm --version

3.10.5

I also ran "where" to see where (and how many) npm was installed. Side note: WHERE is super useful and not used as often as it should be by y'all.

C:\Users\scott>where npm

C:\Program Files\nodejs\npm
C:\Program Files\nodejs\npm.cmd
C:\Users\scott\AppData\Roaming\npm\npm
C:\Users\scott\AppData\Roaming\npm\npm.cmd

This looks OK as two of those npms are shell scripts and not run on Windows. Just to make sure you have the version of npm you want, Felix made a VERY useful npm-windows-upgrade utility that you run like this, ironically with npm.

npm install --global --production npm-windows-upgrade

npm-windows-upgrade

I've used this tool many times with success. It also lets you choose the exact version you want and it's very smart.

However, then I realized that Visual Studio wasn't saying that npm wasn't installed, it was saying a dependency in the npm tree below wasn't installed. That hyphen - is intended to mean something. For the record, I think this is not intuitive and is a poor UX. Perhaps it should say "package(s) not installed" but you get the idea now.

I started manually opening the tree up one item at a time looking for the bad package - while quietly swearing to myself - until I realized that the Solution Explorer tree nodes are searchable.


This is UX issue number two, for me. I think the "broken" library should also include the BANG (!) or warning icon over it. Regardless, now I know I can quickly do a string search.

So what's up with the fsevents library?

I can check the Visual Studio Output Window and I see this:

npm WARN install Couldn't install optional dependency: Unsupported

npm WARN EPACKAGEJSON asp.net@0.0.0 No description

I dropped out to the command prompt, went into the project's folder, and ran "npm ls" to see an ASCII tree that is effectively the same tree you see in Visual Studio. Note the screenshot with the ASCII tree below.

The output of "npm ls" is a nice ASCII tree

What's that "UNMET OPTIONAL DEPENDENCY?" for fsevents?

Looks like fsevents is a package that is only used on OSX, so it's an optional dependency that in the case of Windows is "unmet." It's a WARNING, of sorts. Perhaps it's more an "INFO" or perhaps, one could argue, it doesn't need to be shown at all on Windows.

A quick google shows that, well, the entire world is asking the same thing. So much so that it got pretty heated in this GitHub Issue asking a similar question.

Regardless, it seems to me that something inside Visual Studio really doesn't appreciate that warning/info/notOKness and says, "hey it's not installed." The part that is missing is that, well, it doesn't need to be installed, so Visual Studio should relax.

A Fix

Here's where it gets super interesting. Visual Studio (consider it from their point of view) needs to keep things consistent so it tests with, and ships with, a version of npm.

I can check that like this:

C:\>"C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External\npm.cmd" --version

3.3.4

Again, this makes sense. That way when working in a large group we can all be sure we're on the same version of stuff. When you get a new version of the Web Tools for Visual Studio, presumably this version of npm gets updated. This also means that unless you tell Visual Studio otherwise, the npm and node you get when running a build in VS is gonna be different than if you do it outside - if you stick with the default settings.

Fear not. It's easily updated. I went over to Tools | Options in Visual Studio:

Tools | Options | External Web Tools has its own list of node copies, and it doesn't include mine

See the Web\External there? I added my node installation like this and made sure it was at the top.

Tools | Options | External Web Tools has its own list of node copies, and I have ADDED my path now.

Then I right click on npm and Restore Packages and all is right with the world as now I'm using npm 3.10.5 rather than 3.3.4.

There are no npm errors anymore

The Output Window for 3.10.5 shows this different output, which apparently stresses Visual Studio out less than npm 3.3.4. (Actually I think VS is calling an npm library, rather than shelling out and parsing the "npm ls --json" payload, but you get the idea.)

npm WARN optional Skipping failed optional dependency /chokidar/fsevents:

npm WARN notsup Not compatible with your operating system or architecture: fsevents@1.0.14

Adding my global npm/node folder fixes this issue for me and I can move on.

The Weird

BUT. In my fix I'm using npm globally - it's in my %PATH%. The change I made affects all Visual Studio projects on my machine, forever.

Maybe I need to be smarter? Maybe Visual Studio is already being smart. Note the second option there in the list? It's pointing to .\node_modules\.bin. That's a LOCAL node_modules folder, right? Ah, I can just add a specific version of npm to package.json there if need be, on a project-by-project basis, without affecting my entire system OR changing my Visual Studio settings, right?

However, when I run my build, it's ignoring my project's locally installed npm!

PATH=.\node_modules\.bin;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External;%PATH%;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External\git

"C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External\npm.CMD" install
npm WARN install Couldn't install optional dependency: Unsupported
npm WARN EPACKAGEJSON asp.net@0.0.0 No description
npm WARN EPACKAGEJSON asp.net@0.0.0 No repository field.
npm WARN EPACKAGEJSON asp.net@0.0.0 No license field.

How can I be sure it's ignoring that relative path? I can temporarily hardcode my local node_modules, like this. Note the PATH *and* the newer output. And it works.

PATH=D:\github\sample-NerdDinner\NerdDinner.Web\node_modules\.bin;node_modules\.bin;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External;%PATH%;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External\git

"D:\github\sample-NerdDinner\NerdDinner.Web\node_modules\.bin\npm.CMD" install
npm WARN optional Skipping failed optional dependency /chokidar/fsevents:
npm WARN notsup Not compatible with your operating system or architecture: fsevents@1.0.14
npm WARN asp.net@0.0.0 No description
npm WARN asp.net@0.0.0 No repository field.
npm WARN asp.net@0.0.0 No license field.

In this screenshot below I have npm 3.10.6 installed in my project, locally, but it's not being used by Visual Studio if the path remains relative.

I am 99% sure that the relative path ".\node_modules\.bin" that's prepended to the PATH above either isn't being used OR is interfering in some way.

Am I misunderstanding how it should work? Perhaps I don't understand npm enough?


I'm going to continue to dig into this and I'll update this post when I have an answer. For now, I have a fix given my globally installed npm.

I've also passed my feedback/bug report onto the Visual Studio team in the form of an email with a link to this blog post. I hope this helps someone!


Sponsor: Big thanks to Redgate for sponsoring the feed this week. Could you deploy 1,000 databases? Imagine working in a 70-strong IT team, with 91 applications and 1,000+ databases. Now imagine deployment time. It’s not fiction, it’s fact. Read FlexiGroup's story.




Visual Studio's most useful (and underused) tips


There was a cool comment in my last blog post (one of many, as always, the comments > the content).

Btw, "until I realized that the Solution Explorer tree nodes are searchable." This one is a saver!

The commenter, Sam, noticed a throwaway bit in the middle of the post where I noted that the Solution Explorer was text-searchable. There's a lot of little tricks like this in Visual Studio that even the most seasoned developers sometimes miss. This phenomenon isn't limited to Visual Studio, of course. It's all software! Folks find non-obvious UX all the time in Windows, OSX, iPhone, everyday. If UX were easy then everything would be intuitive but it's not so it ain't. ;)

There's an old joke about Microsoft Office, which is known for having a zillion features.

"Most of the exciting new Office features you discover have always been in Office." - Me and Everyone Else

Here's some exceedingly useful stuff in Visual Studio (It's free to download and use, BTW) that folks often miss.

Search Solution Explorer with Ctrl+;

You can just click the text box above the Solution Explorer to search all the nodes - visible or hidden. Or, press "Ctrl + ;"

Ctrl ; will filter the Solution Explorer

Even stuff that's DEEP in the beast. The resulting view is filtered and will remain that way until you clear the search.

Ctrl ; will filter the Solution Explorer and open subnodes

Quick Launch - Ctrl+Q

If there is one feature that no one uses and everyone should use, it's Quick Launch. Someone told me the internal telemetry numbers show that usage of Quick Launch is in the single digits or lower.

Do you know that you (we) are constantly digging around in the menus for stuff? Most of you use the mouse and go Tools...Options...and stare.

Just press Ctrl+Q and type. Need to change the Font Size?

Find the Fonts Dialog quickly

Want to Compare Files? Did you know VS had that?

Compare Files

What about finding a NuGet package faster than using the NuGet Dialog?


Promise me you'll Ctrl+Q for a few days and see if you can make it a habit. You'll thank yourself.

Map Mode for the Scroll Bar

I love showing people features that totally surprise them. Like "I had NO IDEA that was there" type features. Try "map mode" in the Quick Launch and turn it on...then check out your scroll bar in a large file.

Map Mode for the Scroll Bar

Your scrollbar will turn into a thumbnail that you can hover over and use to navigate your file!

Map Mode turns your Scrollbar into a Scroll Thumbnail

Tab Management

Most folks manage their tabs like this.

  • Open Tab
  • Repeat
  • Declare Tab Bankruptcy
  • Close All Tabs
  • Goto 0

But you DO have both "pinned tabs" and "preview tabs" available.

Pin things you want to keep open

If you pin useful tabs, just like in your browser those tabs will stay to the left and stay open. You can not just "close all" and "close all but this" on a right click, but you can also "close all but pinned."


Additionally, you don't always have to double-click in the Solution Explorer to see what's in a file. That just creates a new tab that you're likely going to close anyway. Try just single clicking, or better yet, use your keyboard. You'll get a preview tab on the far right side. You'll never have more than one and preview tabs won't litter your tab list...unless you promote them.

Navigate To - Ctrl+, (Control+Comma)

Absolutely high on the list of useful things is Ctrl+, for NavigateTo. Why click around with your mouse to open a file or find a specific member or function? Press Ctrl+, and start typing. It searches files, members, types...everything. And you can navigate around with your keyboard before you hit enter.

There's basically no reason to poke around in the Solution Explorer if you already know the name of the item you want to see. Ctrl+, is very fast.


Move Lines with your keyboard

Yes I realize that Visual Studio isn't Emacs or VIM (unless you want it to be VsVim) but it does have a few tiny tricks that most VS users don't use.

You can move lines just by pressing Alt-up/down arrows. I've never seen anyone do this in the wild but me. You can also Shift-Select a bunch of lines and then Alt-Arrow them around as a group.

Move those lines with ALT-ARROW

You can also do Square Selection with Alt and Drag...and drag yourself a nice rectangle...then start typing to type on a dozen lines at once.

Perhaps you knew these, maybe you learned a few things. I think the larger point is to have the five to ten most useful features right there in your mind ready to go. These are mine. What are your tips?


Sponsor: Do you deploy the same application multiple times for each of your end customers? The team at Octopus have been trying to take the pain out of multi-tenant deployments. Check out their 3.4 beta release.



Announcing PowerShell on Linux - PowerShell is Open Source!


I started doing PowerShell almost 10 years ago. Check out this video from 2007 of me learning about PowerShell from Jeffrey Snover! I worked in PowerShell for many years and blogged extensively,  ultimately using PowerShell to script the automation and creation of a number of large systems in Retail Online Banks around the world.

Fast forward to today and Microsoft is announcing PowerShell on Linux powered by .NET Core and it's all open source and hosted at http://GitHub.com/PowerShell/PowerShell.

Holy crap PowerShell on Linux

Jeffrey Snover predicted internally in 2014 that PowerShell would eventually be open sourced, but it was the advent of .NET Core and getting .NET Core working on multiple Linux distros that kickstarted the process. If you were paying attention it's possible you could have predicted this move yourself. Parts of PowerShell had already been showing up as open source.

Get PowerShell everywhere

Ok, but where do you GET IT? http://microsoft.com/powershell is the homepage for PowerShell and everything can be found starting from there.

The PowerShell open source project is at https://github.com/PowerShell/PowerShell and there are alphas for Ubuntu 14.04/16.04, CentOS 7.1, and Mac OS X 10.11.

To be clear, I'm told these are alpha quality builds as work continues with community support. An official Microsoft-supported "release" will come sometime later.

What's Possible?

This is my opinion and somewhat educated speculation, but it seems to me that they want to make it so you can manage anything from anywhere. Maybe you're a Unix person who has some Windows machines (either local or in Azure) that you need to manage. You can use PowerShell from Linux to do that. Maybe you've got some bash scripts at your company AND some PowerShell scripts. Use them both, interchangeably.

If you know PowerShell, you'll be able to use those skills on Linux as well. If you manage a hybrid environment, PowerShell isn't a replacement for bash but rather another tool in your toolkit. There are lots of shells (not just bash, zsh, but also ruby, python, etc) in the *nix world so PowerShell will be in excellent company.

PowerShell on Windows

Related Links

Be sure to check out the coverage from around the web and lots of blog posts from different perspectives!

Have fun! This open source thing is kind of catching on at Microsoft isn't it?


Sponsor: Do you deploy the same application multiple times for each of your end customers? The team at Octopus have been trying to take the pain out of multi-tenant deployments. Check out their 3.4 beta release.




Psychic Weight - Dealing with the things that press on your mind

Close up view of man on his phone 

I was really stressed out ten years ago. I felt that familiar pressure between my eyes and felt like all the things that remained undone were pressing on me. I called it "psychic weight." I have since then collected my Productivity Tips and written extensively on the topic of productivity and getting things done. I'm going to continue to remind YOU that Self-Care Matters in between my technical and coding topics. The essence of what I learned was to let go.

The Serenity Prayer:

God, grant me the serenity to accept the things I cannot change,
Courage to change the things I can,
And wisdom to know the difference.

Everyone has stress and everyone has pressure. There's no magic fix or silver bullet for stress, but I have found that some stressors have a common root cause. Things that stress me are things I think I need to do, handle, watch, take care of, worry about, sweat, deal with, or manage. These things press on me - right between my eyes - and the resulting feeling is what I call psychic weight.

For example: When the DVR (Digital Video Recorder) came out it was a gift from on high. What? A smart VCR that would just tape and hold all the TV Shows that I love? I don't have to watch shows at the time and day they come on? Sign me up. What a time saver!

Fast forward a few years and the magical DVR is now an infinite todo list of TV shows. It's a guilt pile. A failure queue. I still haven't watched The Wire. (I know.) It presses on me. I've actually had conversations with my wife like "ok, if we bang out the first season by staying up tonight until 4am, we should be all ready when Season 2 drops next week." Seriously. Yes, I know, Unwatched TV is a silly example. But you've binge-watched Netflix when you should have been working/reading/working out so you can likely relate.

But I'm letting go. I'll watch The Wire another time. I'll delete it from my DVR. I'm never going to watch the second season of Empire. (Sorry, Cookie. I love you Taraji!) I'm not going to read that pile of technical books on my desk. So I'm going to declare that to the universe and I'm going to remove the pile of books that's staring at me. This book stack, this failure pile is no more. And I'm not even mad. I'm OK with it.

Every deletion like this from your life opens up your time - and your mind - for the more important things you need to focus on.

What are your goals? What can you delete from your list (and I mean, DROP, not just postpone) that will free up your internal resources so you can focus on your goal?

Delete those emails. Declare email bankruptcy. They'll likely email you again. Delete a few seasons of shows you won't watch. Delete Pokemon Go. Make that stack of technical books on your desk shorter. Now, what positive thing will you fill those gaps with?

You deserve it. Remove psychic weight and lighten up. Then sound off in the comments!

* Image Copyright Shea Parikh / used under license from getcolorstock.com


Sponsor: Aspose makes programming APIs for working with files, like: DOC, XLS, PPT, PDF and countless more.  Developers can use their products to create, convert, modify, or manage files in almost any way.  Aspose is a good company and they offer solid products.  Check them out, and download a free evaluation.




What is Serverless Computing? Exploring Azure Functions


There's a lot of confusing terms in the Cloud space. And that's not counting the term "Cloud." ;)

  • IaaS (Infrastructure as a Services) - Virtual Machines and stuff on demand.
  • PaaS (Platform as a Service) - You deploy your apps but try not to think about the Virtual Machines underneath. They exist, but we pretend they don't until forced.
  • SaaS (Software as a Service) - Stuff like Office 365 and Gmail. You pay a subscription and you get email/whatever as a service. It Just Works.

"Serverless Computing" doesn't really mean there's no server. Serverless means there's no server you need to worry about. That might sound like PaaS, but it's higher level that than.

Serverless Computing is like this - Your code, a slider bar, and your credit card. You just have your function out there and it will scale as long as you can pay for it. It's as close to "cloudy" as The Cloud can get.

Serverless Computing is like this. Your code, a slider bar, and your credit card.

With Platform as a Service, you might make a Node or C# app, check it into Git, deploy it to a Web Site/Application, and then you've got an endpoint. You might scale it up (get more CPU/Memory/Disk) or out (have 1, 2, n instances of the Web App) but it's not seamless. It's totally cool, to be clear, but you're always aware of the servers.

New cloud systems like Amazon Lambda and Azure Functions have you upload some code and it's running seconds later. You can have continuous jobs, functions that run on a triggered event, or make Web APIs or Webhooks that are just a function with a URL.
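
To give a sense of how small "just a function with a URL" can be, here's a rough sketch of an HTTP-triggered function in the C# script (.csx) style Azure Functions uses. It's in the spirit of the built-in templates rather than their exact code, and TraceWriter plus the req extension methods are provided by the Functions environment:

// run.csx - minimal HTTP-triggered function sketch
using System;
using System.Linq;
using System.Net;
using System.Net.Http;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("HTTP trigger fired.");

    // Read ?name=... from the query string, if present.
    string name = req.GetQueryNameValuePairs()
                     .FirstOrDefault(q => string.Equals(q.Key, "name", StringComparison.OrdinalIgnoreCase))
                     .Value;

    return req.CreateResponse(HttpStatusCode.OK, "Hello " + (name ?? "world"));
}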

I'm going to see how quickly I can make a Web API with Serverless Computing.

I'll go to http://functions.azure.com and make a new function. If you don't have an account you can sign up free.

Getting started with Azure Functions

You can make a function in JavaScript or C#.

Getting started with Azure Functions - Create This Function

Once you're into the Azure Function Editor, click "New Function" and you've got dozens of templates and code examples for things like:

  • Find a face in an image and store the rectangle of where the face is.
  • Run a function and comment on a GitHub issue when a GitHub webhook is triggered
  • Update a storage blob when an HTTP Request comes in
  • Load entities from a database or storage table

I figured I'd change the first example. It is a trigger that sees an image in storage, calls a cognitive services API to get the location of the face, then stores the data. I wanted to change it to:

  • Take an image as input from an HTTP Post
  • Draw a rectangle around the face
  • Return the new image

You can do this work from Git/GitHub but for easy stuff I'm literally doing it all in the browser. Here's what it looks like.

Azure Functions can be done in the browser

I code and iterate and save and fail fast, fail often. Here's the starter code I based it on. Remember that this is a starter function that runs on a triggered event, so note its Run()...I'm going to change this.

#r "Microsoft.WindowsAzure.Storage"
#r "Newtonsoft.Json"
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using Newtonsoft.Json;
using Microsoft.WindowsAzure.Storage.Table;
using System.IO; 
public static async Task Run(Stream image, string name, IAsyncCollector<FaceRectangle> outTable, TraceWriter log)
{
    string result = await CallVisionAPI(image); //STREAM
    log.Info(result); 
    if (String.IsNullOrEmpty(result))
    {
        return;
    }
    ImageData imageData = JsonConvert.DeserializeObject<ImageData>(result);
    foreach (Face face in imageData.Faces)
    {
        var faceRectangle = face.FaceRectangle;
        faceRectangle.RowKey = Guid.NewGuid().ToString();
        faceRectangle.PartitionKey = "Functions";
        faceRectangle.ImageFile = name + ".jpg";
        await outTable.AddAsync(faceRectangle); 
    }
    log.Info("Nice Job");
}
static async Task<string> CallVisionAPI(Stream image)
{
    using (var client = new HttpClient())
    {
        var content = new StreamContent(image);
        var url = "https://api.projectoxford.ai/vision/v1.0/analyze?visualFeatures=Faces";
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", Environment.GetEnvironmentVariable("Vision_API_Subscription_Key"));
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        var httpResponse = await client.PostAsync(url, content);
        if (httpResponse.StatusCode == HttpStatusCode.OK){
            return await httpResponse.Content.ReadAsStringAsync();
        }
    }
    return null;
}
public class ImageData {
    public List<Face> Faces { get; set; }
}
public class Face {
    public int Age { get; set; }
    public string Gender { get; set; }
    public FaceRectangle FaceRectangle { get; set; }
}
public class FaceRectangle : TableEntity {
    public string ImageFile { get; set; }
    public int Left { get; set; }
    public int Top { get; set; }
    public int Width { get; set; }
    public int Height { get; set; }
}

GOAL: I'll change this Run() and make this listen for an HTTP request that contains an image, read the image that's POSTed in (ya, I know, no validation), draw a rectangle around detected faces, then return a new image.

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    var image = await req.Content.ReadAsStreamAsync();

As for the body of this function, I'm 20% sure I'm using too many MemoryStreams but they are getting disposed so take this code as an initial proof of concept. However, I DO need at least the two I have. Regardless, happy to chat with those who know more, but it's more subtle than even I thought. That said, basically call out to the API, get back some face data that looks like this:

2016-08-26T23:59:26.741 {"requestId":"8be222ff-98cc-4019-8038-c22eeffa63ed","metadata":{"width":2808,"height":1872,"format":"Jpeg"},"faces":[{"age":41,"gender":"Male","faceRectangle":{"left":1059,"top":671,"width":466,"height":466}},{"age":41,"gender":"Male","faceRectangle":{"left":1916,"top":702,"width":448,"height":448}}]}

Then take that data and DRAW a Rectangle over the faces detected.

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    var image = await req.Content.ReadAsStreamAsync();
    MemoryStream mem = new MemoryStream();
    image.CopyTo(mem); //make a copy since one gets destroyed in the other API. Lame, I know.
    image.Position = 0;
    mem.Position = 0;
    
    string result = await CallVisionAPI(image); 
    log.Info(result); 
    if (String.IsNullOrEmpty(result)) {
        return req.CreateResponse(HttpStatusCode.BadRequest);
    }
    
    ImageData imageData = JsonConvert.DeserializeObject<ImageData>(result);
    MemoryStream outputStream = new MemoryStream();
    using(Image maybeFace = Image.FromStream(mem, true))
    {
        using (Graphics g = Graphics.FromImage(maybeFace))
        {
            Pen yellowPen = new Pen(Color.Yellow, 4);
            foreach (Face face in imageData.Faces)
            {
                var faceRectangle = face.FaceRectangle;
                g.DrawRectangle(yellowPen, 
                    faceRectangle.Left, faceRectangle.Top, 
                    faceRectangle.Width, faceRectangle.Height);
            }
        }
        maybeFace.Save(outputStream, ImageFormat.Jpeg);
    }
    
    var response = new HttpResponseMessage()
    {
        Content = new ByteArrayContent(outputStream.ToArray()),
        StatusCode = HttpStatusCode.OK,
    };
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
    return response;
}
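
On the MemoryStream question above, a hedged alternative is to buffer the POSTed body once into a byte array and give each consumer its own short-lived stream over those bytes, so everything can be disposed deterministically. A sketch against the same names used in the function above:

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    byte[] imageBytes;
    using (var buffer = new MemoryStream())
    {
        await req.Content.CopyToAsync(buffer); // read the request body exactly once
        imageBytes = buffer.ToArray();
    }

    string result;
    using (var visionStream = new MemoryStream(imageBytes))
    {
        result = await CallVisionAPI(visionStream);
    }
    if (String.IsNullOrEmpty(result))
    {
        return req.CreateResponse(HttpStatusCode.BadRequest);
    }

    // ...then deserialize, draw the rectangles, and return the JPEG exactly as above,
    // creating the Image from a new MemoryStream(imageBytes).
    return req.CreateResponse(HttpStatusCode.OK);
}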

 

Now I go into Postman and POST an image to my new Azure Function endpoint. Here I uploaded a flattering picture of me and an unflattering picture of The Oatmeal. He's pretty in real life, just NOT HERE. ;)

Image Recognition with Azure Functions

So in just about 15 minutes, with no IDE and armed with just my browser, Postman (also my browser), Google/StackOverflow, and Azure Functions, I've got a backend proof of concept.

Azure Functions supports Node.js, C#, F#, Python, PHP *and* Batch, Bash, and PowerShell, which really opens it up to basically anyone. You can use them for anything when you just want a function (or more) out there on the web. Send stuff to Slack, automate your house, update GitHub issues, act as a Webhook, etc. There's some great 3rd-party Azure Functions sample code in this GitHub repo as well. Inputs can come from basically anywhere and outputs can go basically anywhere. If those anywheres are also cloud services like Tables or Storage, you've got a "serverless backend" that is easy to scale.

I'm still learning, but I can see when I'd want a VM (total control) vs a Web App (near total control) vs a "Serverless" Azure Function (less control but I didn't need it anyway, just wanted a function in the cloud.)


Sponsor: Aspose makes programming APIs for working with files, like: DOC, XLS, PPT, PDF and countless more.  Developers can use their products to create, convert, modify, or manage files in almost any way.  Aspose is a good company and they offer solid products.  Check them out, and download a free evaluation.



Publishing an ASP.NET Core website to a cheap Linux VM host


A little Linux VM on Azure is like $13 a month. You can get little Linux machines all over for between $10-15 a month. On Linode they are about $10 a month so I figured it would be interesting to setup an ASP.NET Core website running on .NET Core. As you may know, .NET Core is free, open source, cross platform and runs basically everywhere.

Step 0 - Get a cheap host

I went to Linode (or anywhere) and got the cheapest Linux machine they offered. In this case it's an Ubuntu 14.04 LTS Profile, 64-bit, 4.6.5 Kernel.

I signed up for a tiny VM at Linode

Since I'm on Windows and I want to SSH into this Linux machine, I'll need an SSH client. There's a bunch of options.

Step 0.5 - Setup a user that isn't root

It's always a good idea to avoid being root. After logging into the system as root, I made a new user and gave them sudo (super user do):

adduser scott

usermod -aG sudo scott

Then I'll logout and go back in as scott.

Step 1 - Get .NET Core on your Linux Machine

Head over to http://dot.net to get .NET Core and follow the instructions. There are at least 8 Linuxes supported in 6 flavors, so you should have no trouble. I followed the Ubuntu instructions.

To make sure it works after you've set it up, make a quick console app like this and run it.

mkdir testapp

cd testapp
dotnet new
dotnet restore
dotnet run

If it runs, then you've got .NET Core installed and you can move on to making a web app and exposing it to the internet.

NOTE: If "dotnet restore" fails with a segmentation fault, you may be running into this issue with some 64-bit Linux Kernels. Here's commands to fix it that worked for me on Ubuntu 14.04 when I hit this. The fix has been released as a NuGet now but it will be included with the next minor release of .NET Core, but if you ever need to manually update the CoreCLR you can.

Step 2 - Make an ASP.NET Core website

You can make an ASP.NET Core website that is very basic and very empty and that's OK. You can also get Yeoman and use the ASP.NET yeoman-based generators to get more choices. There is also the great ASP.NET MVC Boilerplate project for Visual Studio.

Or you can just start with:

dotnet new -t web

Today, this default site uses npm, gulp, and bower to manage JavaScript and CSS dependencies. In the future there will be options that don't require as much extra stuff but for now, in order to dotnet restore this site I'll need npm and what not so I'll do this to get node, npm, etc.

sudo apt-get install npm

sudo npm install gulp
sudo npm install bower

Now I can dotnet restore easily and run my web app to test. It will usually start up on localhost:5000.

$ dotnet restore

$ dotnet run
scott@ubuntu:~/dotnettest$ dotnet run
Project dotnettest (.NETCoreApp,Version=v1.0) was previously compiled. Skipping compilation.
info: Microsoft.Extensions.DependencyInjection.DataProtectionServices[0]
User profile is available. Using '/home/scott/.aspnet/DataProtection-Keys' as key repository; keys will not be encrypted at rest.
Hosting environment: Production
Content root path: /home/scott/dotnettest
Now listening on: http://localhost:5000

Of course, having something startup on localhost:5000 doesn't help me as I'm over here at home so I can't test a local website like this. I want to expose this site (via a port) to the outside world. I want something like http://mysupermachine -> inside my machine -> localhost:5000.

Step 3 - Expose your web app to the outside.

I could tell Kestrel - that's the .NET Web Server - to expose itself to Port 80, although you usually want to have another process between you and the outside world.

You can do this a few ways. You can open Program.cs with an editor like "pico" and add a .UseUrls() call to the WebHostBuilder like this.

var host = new WebHostBuilder()
    .UseKestrel()
    .UseUrls("http://*:80")
    .UseContentRoot(Directory.GetCurrentDirectory())
    .UseStartup<Startup>()
    .Build();

Here the * binds to all the network adapters and it listens on Port 80. Putting http://0.0.0.0:80 also works.

You might have permission issues doing this and need to elevate the dotnet process and web server, which is its own problem, so let's just keep it at a high internal port and reverse-proxy the traffic with something like Nginx or Apache. We'll pull out the hard-coded port from the code and change the Program.cs to use a .json config file.

public static void Main(string[] args)
{
    var config = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile("hosting.json", optional: true)
        .Build();

    var host = new WebHostBuilder()
        .UseKestrel()
        .UseConfiguration(config)
        .UseContentRoot(Directory.GetCurrentDirectory())
        .UseStartup<Startup>()
        .Build();

    host.Run();
}

The hosting.json file is just this:

{
  "server.urls": "http://localhost:5123"
}

We can also use "AddCommandLine(args)" instead of "AddJsonFile()" and pass in --server.urls=http://*:5123 on the command line. It's up to you. You can also use the ASPNETCORE_URLS environment variable.
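
To show that in code, here's a hedged sketch of a Main that layers hosting.json and the command line so either can set the URL (later sources win); AddCommandLine needs the Microsoft.Extensions.Configuration.CommandLine package referenced in project.json:

using System.IO;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;

public class Program
{
    public static void Main(string[] args)
    {
        // Later sources override earlier ones, so --server.urls=... on the
        // command line beats the value in hosting.json.
        var config = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("hosting.json", optional: true)
            .AddCommandLine(args)
            .Build();

        var host = new WebHostBuilder()
            .UseKestrel()
            .UseConfiguration(config)
            .UseContentRoot(Directory.GetCurrentDirectory())
            .UseStartup<Startup>()
            .Build();

        host.Run();
    }
}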

NOTE: I'm doing this work in a folder under my home folder ~ for now. I'll later compile and "publish" this website to something like /var/dotnettest when I want it seen.

Step 4 - Setup a Reverse Proxy like Nginx

I'm following the detailed instructions at the ASP.NET Core Docs site called "Publish to a Linux Production Environment." (All the docs are on GitHub as well)

I'm going to bring in Nginx and start it.

sudo apt-get install nginx

sudo service nginx start

I'm going to change the default Nginx site to point to my (future) running ASP.NET Core web app. I'll open and change /etc/nginx/sites-available/default and make it look like this. Note the port number. Nginx is a LOT more complex than this and has a lot of nuance, so when you are ready to go into Super Official Production, be sure to explore what the perfect Nginx Config File looks like and change it to your needs.

server {
    listen 80;
    location / {
        proxy_pass http://localhost:5123;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection keep-alive;
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}

Then we'll check it and reload the config.

sudo nginx -t 

sudo nginx -s reload

Step 5 - Keep your website running

The website isn't up and running on localhost:5123 yet (unless you've run it yourself and kept it running!) so we'll need an app or a monitor to run it and keep it running. There's an app called Supervisor that is good at that so I'll add it.

sudo apt-get install supervisor

Here is where you/we/I/errbody needs to get the paths and names right, so be aware. I'm over in ~/testapp or something. I need to publish my site into a final location so I'm going to run dotnet publish, then copy the results into /var/dotnettest where it will live.

dotnet publish

publish: Published to /home/scott/dotnettest/bin/Debug/netcoreapp1.0/publish
sudo cp -a /home/scott/dotnettest/bin/Debug/netcoreapp1.0/publish /var/dotnettest

Now I'm going to make a file (again, I use pico because I'm not as awesome as emacs or vim) called /etc/supervisor/conf.d/dotnettest.conf to start my app and keep it running:

[program:dotnettest]

command=/usr/bin/dotnet /var/dotnettest/dotnettest.dll --server.urls:http://*:5123
directory=/var/dotnettest/
autostart=true
autorestart=true
stderr_logfile=/var/log/dotnettest.err.log
stdout_logfile=/var/log/dotnettest.out.log
environment=ASPNETCORE_ENVIRONMENT=Production
user=www-data
stopsignal=INT

Now we start and stop Supervisor and watch/tail its logs to see our app startup!

sudo service supervisor stop

sudo service supervisor start
sudo tail -f /var/log/supervisor/supervisord.log
#and the application logs if you like
sudo tail -f /var/log/dotnettest.out.log

If all worked out (if it didn't, it'll be a name or a path so keep trying!) you'll see the supervisor log with dotnet starting up, running your app.

Hey it's dotnet on linux

Remember the relationships.

  • Dotnet - runs your website
  • Nginx or Apache - Listens on Port 80 and forwards HTTP calls to your website
  • Supervisor - Keeps your app running

Next, I might want to setup a continuous integration build, or SCP/SFTP to handle deployment of my app. That way I can develop locally and push up to my Linux machine.

Hey it's my ASP.NET Core app on Linux

Of course, there are a dozen other ways to publish an ASP.NET Core site, not to mention Docker. I'll post about Docker another time, but for now, I was able to get my ASP.NET Core website published to a cheap $10 host in less than an hour. You can use the same tools to manage a .NET Core site that you use to manage any site be it PHP, nodejs, Ruby, or whatever makes you happy.


Sponsor: Aspose makes programming APIs for working with files, like: DOC, XLS, PPT, PDF and countless more.  Developers can use their products to create, convert, modify, or manage files in almost any way.  Aspose is a good company and they offer solid products.  Check them out, and download a free evaluation.



© 2016 Scott Hanselman. All rights reserved.
     

How to deal with Technology Burnout - Maybe it's life's cycles

Burnout photo by Michael Himbeault used under cc

Sarah Mei had a great series of tweets last week. She's a Founder of RailsBridge, Director of Ruby Central, and the Chief Consultant of DevMynd, so she's experienced with work both "on the job" and "on the side." Like me, she organizes OSS projects and conferences, but she also has a life, as do I.

If you're reading this blog, it's likely that you have gone to a User Group or Conference, or in some way did some "on the side" tech activity. It could be that you have a blog, or you tweet, or you do videos, or you volunteer at a school.

With Sarah's permission, I want to take a moment and call out some of these tweets and share my thoughts about them. I think this is an important conversation to have.

This is vital. Life is cyclical. You aren't required or expected to be ON 130 hours a week your entire working life. It's unreasonable to expect that of yourself. Many of you have emailed me about this in the past. "How do you do _____, Scott?" How do you deal with balance, hang with your kids, do your work, do videos, etc.

I don't.

Sometimes I just chill. Sometimes I play video games. Last week I was in bed before 10pm two nights. I totally didn't answer email that night either. Balls were dropped and the world kept spinning.

Sometimes you need to be told it's OK to stop, Dear Reader. Slow down, breathe. Take a knee. Hell, take a day.

Here's where it gets really real. We hear a lot about "burnout." Are you REALLY burnt? Maybe you just need to chill. Maybe going to three User Groups a month (or a week!) is too much? Maybe you're just not that into the tech today/this week/this month. Sometimes I'm so amped on 3D printing and sometimes I'm just...not.

Am I burned out? Nah. Just taking a break.

Whatever you're working on, likely it will be there later. Will you?

Is your software saving babies? If so, kudos, and please, keep doing whatever you're doing! If not, remember that. Breathe and remember that while the tech is important, so are you and those around you. Take care of yourself and those around you. You all work hard, but are you paying yourself first?

You're no good to us dead.

I realize that not everyone with children in their lives can get/afford a sitter but I do also want to point out that if you can, REST. RESET. My wife and I have Date Night. Not once a month, not occasionally. Every week. As we tell our kids: We were here before you and we'll be here after you leave, so this is our time to talk to each other. See ya!

Thank you, Sarah, for sharing this important reminder with us. Cycles happen.

Related Reading

* Burnout photo by Michael Himbeault used under CC



© 2016 Scott Hanselman. All rights reserved.
     

Putting (my VB6) Windows Apps in the Windows 10 Store - Project Centennial


Evernote in the Windows 10 Store with Project Centennial

I noticed today that Evernote was in the Windows Store. I went to the store, installed Evernote, and it ran. No nextnextnextfinish-style install, it just worked and it worked nicely. It's a Win32 app and it appears to use NodeWebKit for part of its UI. But it's a Windows app, just like VB6 apps and just like .NET apps and just like UWP (Universal Windows Platform) apps, so I found this to be pretty cool. Now that the Evernote app is a store app it can use Windows 10 specific features like Live Tiles and Notifications and it'll always be up to date.

The Windows Store is starting (slowly) to roll out and include existing desktop apps and games by building and packaging those apps using the Universal Windows Platform. This was called "Project Centennial" when they announced it at the BUILD conference. It lets you basically put any Windows App in the Windows Store, which is cool. Apps that live there are safe, won't mess up your machine, and are quickly installed and uninstalled.

Here's some of the details about what's happening with your app behind the scenes, from this article. This is one of the main benefits of the Windows Store. Apps from the Store can't mess up your system on a global scale.

[The app] runs in a special environment where any accesses that the app makes to the file system and to the Registry are redirected. The file named Registry.dat is used for Registry redirection. It's actually a Registry hive, so you can view it in the Windows Registry Editor (Regedit). When it comes to the file system, the only thing redirected is the AppData folder, and it is redirected to the same location that app data is stored for all UWP apps. This location is known as the local app data store, and you access it by using the ApplicationData.LocalFolder property. This way, your code is already ported to read and write app data in the correct place without you doing anything. And you can also write there directly. One benefit of file system redirection is a cleaner uninstall experience.
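
To make that concrete, here's a minimal sketch (not from the article) of ordinary desktop code saving a settings file under local app data; for a converted app that same call gets redirected to the package's private store without any code changes. The folder and file names are hypothetical.

using System;
using System.IO;

static class SettingsStore
{
    public static void Save(string contents)
    {
        // The classic way to find AppData; per the quoted docs, this is the
        // location that gets redirected to the package's local app data store
        // when the app is packaged with the Desktop Bridge.
        var dir = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
        var path = Path.Combine(dir, "Project1", "settings.txt"); // hypothetical names
        Directory.CreateDirectory(Path.GetDirectoryName(path));
        File.WriteAllText(path, contents);
    }
}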

The "DesktopAppConverter" is now packaged in the Windows Store as well, even though it runs at the command prompt! If your Windows Desktop app has a "silent installer" then you can run this DesktopAppConvertor on your installer to make an APPX package that you can then theoretically upload to the Store.

NOTE: This "Centennial" technology is in Windows 10 AU, so if you haven't auto-updated yet, you can get AU now.

They are also working with install vendors like InstallShield and WiX so that their installation creation apps will create Windows Store apps with the Desktop Bridge automatically. This way your existing MSIs and stuff can turn into UWP packages and live in the store.

DesktopAppConverter

It looks like there are a few ways to make your existing Windows apps into Windows 10 Store-ready apps. You can use this DesktopAppConverter and run it against your existing silent installer. Once you've made your app a Store App, you can "light up" your app with Live Tiles and Notifications and other features with code. Check out the https://github.com/Microsoft/DesktopBridgeToUWP-Samples GitHub repo with samples that show you how to add Tiles or Background tasks. You can use [Conditional("DesktopUWP")] compilation if you have both a Windows Store and a Windows desktop version of your app with a traditional installer.
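
To make that last bit concrete, here's a rough sketch of conditional compilation in a dual-target code base. The method name and the tile work are made up, and I'm assuming the Store-targeted build configuration is the one that defines the DesktopUWP symbol:

using System.Diagnostics;

public static class StoreFeatures
{
    // Calls to this method are removed at compile time unless the build defines
    // the DesktopUWP symbol, so the classic desktop build never calls into
    // Store-only code. The body still compiles either way, so keep real UWP API
    // calls behind #if DesktopUWP as well.
    [Conditional("DesktopUWP")]
    public static void UpdateLiveTile(string status)
    {
#if DesktopUWP
        // Hypothetical placeholder for Windows.UI.Notifications tile/toast code.
#endif
    }
}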

If your app is a simple Xcopy-deploy app that has no installer, it's even easier. To prove this I installed Visual Basic 6 on my Windows 10 machine. OH YES I DID.

NOTE: I am using VB6 as a fun but also very cool example. VB6 is long out of support but apps created with it still run great on Windows because they are win32 apps. For me, this means that if I had a VB6 app that I wanted to move into the Store and widen my audience, I could.

I made a quick little Project1.exe in VB6 that runs on its own.

Visual Basic 6 on Windows 10

I made an AppxManifest.xml with these contents following this HelloWorld sample.

<?xml version="1.0" encoding="utf-8"?>

<Package
xmlns="http://schemas.microsoft.com/appx/manifest/foundation/windows10"
xmlns:uap="http://schemas.microsoft.com/appx/manifest/uap/windows10"
xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities">
<Identity Name="HanselmanVB6"
ProcessorArchitecture="x64"
Publisher="CN=HanselmanVB6"
Version="1.0.0.0" />
<Properties>
<DisplayName>Scott Hanselman uses VB6</DisplayName>
<PublisherDisplayName>Reserved</PublisherDisplayName>
<Description>I wish there was a description entered</Description>
<Logo>Assets\Logo.png</Logo>
</Properties>
<Resources>
<Resource Language="en-us" />
</Resources>
<Dependencies>
<TargetDeviceFamily Name="Windows.Desktop" MinVersion="10.0.14316.0" MaxVersionTested="10.0.14316.0" />
</Dependencies>
<Capabilities>
<rescap:Capability Name="runFullTrust"/>
</Capabilities>
<Applications>
<Application Id="HanselmanVB6" Executable="Project1.exe" EntryPoint="Windows.FullTrustApplication">
<uap:VisualElements
BackgroundColor="#464646"
DisplayName="Hey it's VB6"
Square150x150Logo="Assets\SampleAppx.150x150.png"
Square44x44Logo="Assets\SampleAppx.44x44.png"
Description="Hey it's VB6" />
</Application>
</Applications>
</Package>

In the folder is my Project1.exe along with an Assets folder with my logo and a few PNGs.

Now I can run the DesktopAppConverter if I have a quiet installer, but since I've just got a small xcopyable app, I'll run this to test on my local machine.

Add-AppxPackage -register .\AppxManifest.xml

And now my little VB6 app is installed locally and in my Start Menu.

VB6 as a Windows App

When I am ready to get my app ready for production and submission to the Store I'll follow the guidance and docs here and use Visual Studio, or just do the work manually at the command line with the MakeAppx and SignTool utilities.

"C:\Program Files (x86)\Windows Kits\10\bin\x86\makeappx" pack /d . /p Project1.appx

Later I'll buy a code signing cert, but for now I'll make a fake local one, trust it, and make a pfx cert.

"C:\Program Files (x86)\Windows Kits\10\bin\x86\makecert" /n "CN=HanselmanVB6" /r /pe /h /0 /eku "1.3.6.1.5.5.7.3.3,1.3.6.1.4.1.311.10.3.13" /e 12/31/2016 /sv MyLocalKey1.pvk MyLocalKey1.cer

"C:\Program Files (x86)\Windows Kits\10\bin\x86\pvk2pfx" -po -pvk MyLocalKey1.pvk -spc MyLocalKey1.cer -pfx MyLocalKey1.pfx
certutil -user -addstore Root MyLocalKey1.cer

Now I'll sign my Appx.

NOTE: Make sure the Identity in the AppxManifest matches the code signing cert's CN=Identity. That's the FULL string from the cert. Otherwise you'll see weird stuff in your Event Viewer in Microsoft|Windows\AppxPackagingOM|Microsoft-Windows-AppxPackaging/Operational like "error 0x8007000B: The app manifest publisher name (CN=HanselmanVB6, O=Hanselman, L=Portland, S=OR, C=USA) must match the subject name of the signing certificate exactly (CN=HanselmanVB6)."

I'll use a command line like this. Remember that Visual Studio can hide a lot of this, but since I'm doing it manually it's good to understand the details.

"C:\Program Files (x86)\Windows Kits\10\bin\x86\signtool.exe" sign /debug /fd SHA256 /a /f MyLocalKey1.pfx Project1.appx


The following certificates were considered:
Issued to: HanselmanVB6
Issued by: HanselmanVB6
Expires: Sat Dec 31 00:00:00 2016
SHA1 hash: 19F384D1D0BD33F107B2D7344C4CA40F2A557749

After EKU filter, 1 certs were left.
After expiry filter, 1 certs were left.
After Private Key filter, 1 certs were left.
The following certificate was selected:
Issued to: HanselmanVB6
Issued by: HanselmanVB6
Expires: Sat Dec 31 00:00:00 2016
SHA1 hash: 19F384D1D0BD33F107B2D7344C4CA40F2A557749


The following additional certificates will be attached:
Done Adding Additional Store
Successfully signed: Project1.appx

Number of files successfully Signed: 1
Number of warnings: 0
Number of errors: 0

Now I've got a (local developer) signed, packaged Appx that has a VB6 app inside it. If I double click I'll get the Appx installer, but what I really want to do is sign it with a real cert and put it in the Windows Store!

VB6 in the Windows Store

Here's the app running. Pretty amazing UX, I know.

VB6 app as a Windows Store App

It's early days, IMHO, but I'm looking forward to a time when I can go to the Windows Store and get my favorite apps like Windows Open Live Writer, Office, Slack, and more! Now's the time for you to start exploring these tools.

Related Links


Sponsor: Big thanks to Redgate for sponsoring the feed this week. Discover the world’s most trusted SQL Server comparison tool. Enjoy a free trial of SQL Compare, the industry standard for comparing and deploying SQL Server schemas.



© 2016 Scott Hanselman. All rights reserved.
     

Self-contained .NET Core Applications


Just in case you missed it, .NET is all open source now and .NET Core is a free, open source, cross-platform framework that you can download and start with in <10 minutes. You can get it on Mac, Windows, and a half-dozen Unixes at http://dot.net. Take that along with the free, cross-platform Visual Studio Code and you'll be writing C# and F# all over the place.

Ok, that said, there are two ways to deploy a .NET Core application. There's FDD and SCD. Since TLAs (three letter acronyms) are stupid, that's Framework-dependent and Self-contained. When .NET Core is installed it ends up in C:\program files\dotnet on Windows, for example. In the "Shared" folder there's a bunch of .NET stuff that is, well, shared. There may be multiple folders, as you can see in my folder below. You can have multiple, side-by-side installs of .NET Core.

When you're installing your app and its dependencies BUT NOT .NET Core itself, you're dependent on .NET Core already being on the target machine. That's fine for Web Apps or systems with lots of apps, but what if I want to write an app and give it to you as a zip or on a USB key and just have it work? I'll include .NET Core as well so the whole thing is a Self Contained Deployment.

It will make my "Hello World" application larger than if I was using an existing system-wide install, but I know it'll Just Work because it'll be totally self-contained.

Where is .NET Core installed to?

If I deploy my app along with .NET Core it's important to remember that I'll be responsible for servicing .NET Core and keeping it up to date. I also need to decide on my target platforms ahead of time. If I want it to run on Windows, Mac, and Linux AND just work, I'll need to include those target platforms and build deployment packages for them. This all makes mostly intuitive sense but it's good to know.

I'll take my little app (I'm just using a "dotnet new" app) and I'll modify project.json in a text editor.

My app is a .NETCore.App, but it's not going to use the .NET Core platform that's installed. It'll use a local version, so I'll remove "type": "platform" from this dependency.

"frameworks": {

"netcoreapp1.0": {
"dependencies": {
"Microsoft.NETCore.App": {
"version": "1.0.1"
}
}
}
}

Next I'll make a runtimes section to specify which ones I want to target. There's a list of ALL the Runtime IDs here.

"runtimes": {

"win10-x64": {},
"osx.10.10-x64": {},
"ubuntu.14.04-x64": {}
}

After running "dotnet restore" you'll want to build for each of these like this:

dotnet build -r win10-x64

dotnet build -r osx.10.10-x64
dotnet build -r ubuntu.14.04-x64

And then publish release versions after you've tested, etc.

dotnet publish -c release -r win10-x64

dotnet publish -c release -r osx.10.10-x64
dotnet publish -c release -r ubuntu.14.04-x64

Once this is done, I've got my app self-contained in n folders, ready to deploy to whatever systems I want.

Self-contained .NET app built for 3 OSs

You can see in the Win10 folder there's my "MYAPPLICATION.exe" (mine is called scd.exe) that can be run, rather than running things like developers do with "dotnet run."

I run foo.exe, not dotnet.exe now

There's lots of good documentation about how you can tune and define exactly what gets deployed with your self contained application over at the .NET Core Docs. You can do considerable trimming to .NET Core, and there's talk of that becoming more and more automated in the future, possibly down to the method level.


Sponsor: Big thanks to Redgate for sponsoring the feed this week. Discover the world’s most trusted SQL Server comparison tool. Enjoy a free trial of SQL Compare, the industry standard for comparing and deploying SQL Server schemas.



© 2016 Scott Hanselman. All rights reserved.
     

FIXED: Xbox One losing TV signal error message with DirectTV


Your TV signal was lost

I've got an Xbox One that I love that is connected to a DirectTV HDTV Receiver that I love somewhat less. The setup is quite simple. Since I can control the DirectTV with the Xbox One and we like to switch between Netflix and Hulu and DirectTV we use the Xbox One to control everything.

The basic idea is this, which is quite typical with an Xbox One. In theory, it's amazing.

I fixed my Xbox One losing Signal with an HDMI powered splitter

However, this doesn't always work. Often you'll turn on the whole system and the Xbox will say "Your TV Signal was lost. Make sure your cable or satellite box is on and plugged into the Xbox." This got so bad in our house that my non-technical spouse was ready to "buy a whole new TV." I was personally blaming the Xbox.

It turns out that's an issue of HDMI compliance. The DirectTV and other older cable boxes aren't super awesome about doing things the exact way HDMI likes it, and the Xbox is rather picky about HDMI being totally legit. So how do I "clean" or "fix" my HDMI signal from my Cable/Satellite receiver?

I took a chance and asked on Reddit and this very helpful user (thanks!) suggested an HDMI splitter. I was surprised but I was ready to try anything so I ordered this 2 port HDMI powered splitter from Amazon for just US$20.

ADDING AN HDMI SPLITTER WORKS - TOTALLY SOLVED THE PROBLEM

It totally works. The Xbox One now does its "negotiations" with the compliant splitter, not with the Receiver directly and we haven't seen a single problem since.

I fixed my Xbox One losing Signal with an HDMI powered splitter

If you have had this problem with your Xbox One then pick up a 2 port HDMI powered splitter and rejoice. This is a high quality splitter that doesn't change the audio signal and still works with HDCP if needed. Thanks internets!


Sponsor: Big thanks to Telerik for sponsoring the feed this week. Try Kendo UI by Progress: The most complete set of HTML5 UI widgets and JavaScript app tools helping you cut development time.



© 2016 Scott Hanselman. All rights reserved.
     

Lonely Coding

codingisbettertogether

It's official. I'm a better programmer when I'm pairing with someone. Pair Programming (two people, one keyboard) has been around for at least 20+ years, if not much longer. Usually one person types while another person (paces around and) thinks. It is kind of a "driver and navigator" model.

Everyone is different, to be clear, so it's very possible that you are the kind of person who can disappear into a closet for 8 hours and emerge with code, triumphant. I've done this before. Some of my best projects have involved me coding alone and that's fine.

However, just as we know that "diverse teams make for better projects," the same is true in my experience when coding on specific problems. Diversity isn't just color and gender, etc, it's as much background, age, personal history, work experience, expertise, programming language of choice, heck, it's even google-ability, and more!

How many times have you banged your head against a wall while coding only to have a friend or co-worker find the answer on their first web search?

Good pair programming is like that. Those ah-ha moments happen more often and you'll feel more than twice as productive in a pair.

In fact, I'm trying to pair for an hour every week remotely. Mark Downie and I have been pairing on DasBlog on and off for a year or so now in fits and starts. It's great. Just last week he and I were trying to crack one problem using regular expressions (yes, then we had two problems) and because there were two of us looking at the code it was solved!

Why is pair programming better?

Here's a few reasons why I think Pair Programming is very often better.

  • Focus and Discipline - We set aside specific times and we sprint. We don't chat, we don't delete email, we code. And we also code with a specific goal or endpoint in mind.
  • Collective ownership - I feel like we own the code together. I feel less ego about the code. Our hacks are our hacks, and our successes are shared.
  • Personal growth - We mentor each other. We learn and we watch how the other moves around the code. I've learned new techniques, new hotkeys, and new algorithms.

Let's talk about the remote aspect of things. I'm remote. I also like to poke around on non-work-related tech on the side, as do many of us. Can I pair program remotely as well? Absolutely. I start with Skype, but I also use Google Hangouts, Join.me, TeamViewer, whatever works that day.

If you're a remote person on a larger team, consider remote pair programming. If you're a consultant or perhaps you've left a big corporate job to strike off on your own, you might be lonely. Seriously, ask yourself that hard question. It's no fun to realize or have to declare you're a lonely coder, but I am and I will. I love my job and I love my team but if I go a day or two without seeing another human or spending some serious time on Skype I get really tense. Remote pair programming can really reduce that feeling of lonely coding.

I was at a small tech get-together in Atlanta a few days ago and I knew that one person there was a singular coder at their small business while another at the table was a college student with emerging talent. I made a gentle suggestion that maybe they consider pairing up on some side projects and they both lit up.

Consider your networks. Are there people you've met at conferences or at local user groups or meetups that might be good remote pairing partners? This might be the missing link for you. It was for me!

Do you pair? Do you pair remotely? Let us all know in the comments.

* Stock photo purchased from ColorStock - Your customers are diverse, why aren't your stock photos?


Sponsor: Big thanks to Telerik for sponsoring the feed this week. Try Kendo UI by Progress: The most complete set of HTML5 UI widgets and JavaScript app tools helping you cut development time.



© 2016 Scott Hanselman. All rights reserved.
     

Sharing Authorization Cookies between ASP.NET 4.x and ASP.NET Core 1.0


ASP.NET Core 1.0 runs on ASP.NET 4.6 nicely

ASP.NET Core 1.0 is out, as is .NET Core 1.0, and lots of folks are making great cross-platform web apps. These are Web Apps that are built on .NET Core 1.0 and run on Windows, Mac, or Linux.

However, some people don't realize that ASP.NET Core 1.0 (that's the web framework bit) runs on either .NET Core or .NET Framework 4.6 aka "Full Framework."

Once you realize that it can be somewhat liberating. If you want to check out the new ASP.NET Core 1.0 and use the unified controllers to make web apis or MVC apps with Razor you can...even if you don't need or care about cross-platform support. Maybe your libraries use COM objects or Windows-specific stuff. ASP.NET Core 1.0 works on .NET Framework 4.6 just fine.

Another option that folks don't consider when talk of "porting" their apps comes up at work is - why not have two apps? There's no reason to start a big porting exercise if your app works great now. Consider that you can have a section of your site be on ASP.NET Core 1.0 and another on ASP.NET 4.x, and the two apps could share authentication cookies. The user would never know the difference.

Barry Dorrans from our team looked into this, and here's what he found. He's interested in your feedback, so be sure to file issues on his GitHub Repo with your thoughts, bugs, and comments. This is a work in progress and at some point will be updated into the official documentation.

Sharing Authorization Cookies between ASP.NET 4.x and .NET Core

Barry is building a GitHub repo here with two sample apps and a markdown file to illustrate clearly how to accomplish cookie sharing.

When you want to share logins with an existing ASP.NET 4.x app and an ASP.NET Core 1.0 app, you'll be creating a login cookie that can be read by both applications. It's certainly possible for you, Dear Reader, to "hack something together" with sessions and your own custom cookies, but please let this blog post and Barry's project be a warning. Don't roll your own crypto. You don't want to accidentally open up one or both of your apps to hacking because you tried to extend auth/auth in a naïve way.

First, you'll need to make sure each application has the right NuGet packages to interop with the security tokens you'll be using in your cookies.

Install the interop packages into your applications.

  1. ASP.NET 4.5

    Open the nuget package manager, or the nuget console and add a reference to Microsoft.Owin.Security.Interop.

  2. ASP.NET Core

    Open the nuget package manager, or the nuget console and add a reference to Microsoft.AspNetCore.DataProtection.Extensions.

Make sure the Cookie Names are identical in each application

Barry is using CookieName = ".AspNet.SharedCookie" in the example, but you just need to make sure they match.

services.AddIdentity<ApplicationUser, IdentityRole>(options =>

{
options.Cookies = new Microsoft.AspNetCore.Identity.IdentityCookieOptions
{
ApplicationCookie = new CookieAuthenticationOptions
{
AuthenticationScheme = "Cookie",
LoginPath = new PathString("/Account/Login/"),
AccessDeniedPath = new PathString("/Account/Forbidden/"),
AutomaticAuthenticate = true,
AutomaticChallenge = true,
CookieName = ".AspNet.SharedCookie"

}
};
})
.AddEntityFrameworkStores<ApplicationDbContext>()
.AddDefaultTokenProviders();

Remember the CookieName property must have the same value in each application, and the AuthenticationType (ASP.NET 4.5) and AuthenticationScheme (ASP.NET Core) properties must have the same value in each application.

Be aware of your cookie domains if you use them

Browsers naturally share cookies between the same domain name. For example if both your sites run in subdirectories under https://contoso.com then cookies will automatically be shared.

However if your sites run on subdomains a cookie issued to a subdomain will not automatically be sent by the browser to a different subdomain, for example, https://site1.contoso.com would not share cookies with https://site2.contoso.com.

If your sites run on subdomains you can configure the issued cookies to be shared by setting the CookieDomain property in CookieAuthenticationOptions to be the parent domain.

Try to do everything over HTTPS and be aware that if a Cookie has its Secure flag set it won't flow to an insecure HTTP URL.
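
A minimal sketch of that CookieDomain setting on the ASP.NET Core side, dropped into the same ApplicationCookie options shown above (the domain here is just an example):

ApplicationCookie = new CookieAuthenticationOptions
{
    AuthenticationScheme = "Cookie",
    CookieName = ".AspNet.SharedCookie",
    CookieDomain = ".contoso.com",             // parent domain so site1 and site2 both get the cookie
    CookieSecure = CookieSecurePolicy.Always   // keep the cookie HTTPS-only if you can
}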

Select a common data protection repository location accessible by both applications

From Barry's instructions, his sample will use a shared DP folder, but you have options:

This sample will use a shared directory (C:\keyring). If your applications aren't on the same server, or can't access the same NTFS share you can use other keyring repositories.

.NET Core 1.0 includes key ring repositories for shared directories and the registry.

.NET Core 1.1 will add support for Redis, Azure Blob Storage and Azure Key Vault.

You can develop your own key ring repository by implementing the IXmlRepository interface.

Configure your applications to use the same cookie format

You'll configure each app - ASP.NET 4.5 and ASP.NET Core - to use the AspNetTicketDataFormat for their cookies.
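
Here's a minimal sketch of what that can look like on the ASP.NET Core side, assuming the shared key folder from above (C:\keyring). The purpose strings passed to CreateProtector follow the ones used in the official sample; both apps must use exactly the same values, so copy them from Barry's repo rather than from here:

// Rough usings needed: System.IO, Microsoft.AspNetCore.Authentication,
// Microsoft.AspNetCore.DataProtection
var protectionProvider = DataProtectionProvider.Create(new DirectoryInfo(@"C:\keyring"));

var dataProtector = protectionProvider.CreateProtector(
    "Microsoft.AspNetCore.Authentication.Cookies.CookieAuthenticationMiddleware", // assumed purpose strings,
    "Cookie",                                                                      // identical in both apps
    "v2");

services.AddIdentity<ApplicationUser, IdentityRole>(options =>
{
    options.Cookies.ApplicationCookie.CookieName = ".AspNet.SharedCookie";
    options.Cookies.ApplicationCookie.TicketDataFormat = new TicketDataFormat(dataProtector);
})
.AddEntityFrameworkStores<ApplicationDbContext>()
.AddDefaultTokenProviders();

The ASP.NET 4.5 side does the mirror image of this with the shim types from Microsoft.Owin.Security.Interop (an AspNetTicketDataFormat wrapping a DataProtectorShim over the same key folder and purpose strings); Barry's repo has the exact code.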

Cookie Sharing with ASP.NET Core and ASP.NET Full Framework

According to his repo, this gets us started with Cookie Sharing for Identity, but there still needs to be clearer guidance on how to share the Identity 3.0 database between the two frameworks.

The interop shim does not enable the sharing of identity databases between applications. ASP.NET 4.5 uses Identity 1.0 or 2.0, ASP.NET Core uses Identity 3.0. If you want to share databases you must update the ASP.NET Identity 2.0 applications to use the ASP.NET Identity 3.0 schemas. If you are upgrading from Identity 1.0 you should migrate to Identity 2.0 first, rather than try to go directly to 3.0.

Sound off in the Issues over on GitHub if you would like to see this sample (or another) expanded to show more Identity DB sharing. It looks to be very promising work.


Sponsor: Big thanks to Telerik for sponsoring the blog this week! 60+ ASP.NET Core controls for every need. The most complete UI toolset for x-platform responsive web and cloud development.Try now 30 days for free!



© 2016 Scott Hanselman. All rights reserved.
     

How to reference an existing .NET Framework Project in an ASP.NET Core 1.0 Web App


I had a reader send me a question yesterday. She basically wanted to use her existing .NET Framework libraries in an ASP.NET Core application, and it wasn't super clear how to do it.

I have a quick question for you regarding asp.net core. We are rewriting our website using asp.net core, empty from the bottom up. We have 2 libraries written in .net 4.6 . One is our database model and repositories and the other is a project of internal utilities we use a lot. Unfortunately we cannot see how to reference these two projects in our .net core project.

It can be a little confusing. As I mentioned earlier this week, some people don't realize that ASP.NET Core 1.0 (that's the web framework bit) runs on either .NET Core or .NET Framework 4.6 aka "Full Framework."

ASP.NET Core 1.0 runs on ASP.NET 4.6 nicely

When you make a new web project in Visual Studio you see (today) this dialog. Note in the dropdown at the top you can select your minimum .NET Framework version. You can select 4.6.2, if you like, but I'll do 4.5.2 to be a little more compatible. It's up to you.

File New Project

This dialog could use clearer text and hopefully it will soon.

  • There's the regular ASP.NET Web Application at the top. That's ASP.NET 4.6 with MVC and Web API. It runs on the .NET Framework.
  • There's ASP.NET Core 1.0 running on .NET Core. That's cross-platform. If you select that one you'll be able to run your app anywhere, but you can't reference "Full" .NET Framework assemblies as they are just for Windows. If you want to run everywhere you need to stick to .NET Standard APIs.
  • There's ASP.NET Core 1.0 running on .NET Framework. That's the new ASP.NET Core 1.0 with unified MVC and Web API but running on the .NET Framework you run today on Windows.

As we see in the diagram above, ASP.NET Core 1.0 is the new streamlined ASP.NET  that can run on top of both .NET Framework (Windows) and .NET Core (Mac/Windows/Linux).

I'll chose ASP.NET Core on .NET Framework and I'll see this in my Solution Explorer:

Web App targeting .NET Framework 4.5.2

I've got another DLL that I made with the regular File | New Project | Class Library.

New Class Library

Then I reference it the usual way with Add Reference and it looks like this in the References node in Solution Explorer. Note the icon differences.

Adding ClassLibrary1 to the References Node in Solution Explorer

If we look in the project.json (Be aware that this will change for the better in the future when project.json's functionality is merged with csproj and msbuild) you'll note that the ClassLibrary1 isn't listed under the top level dependencies node, but as a framework-specific dependency like this:

{

"dependencies": {
"Microsoft.StuffAndThings": "1.0.0",
"Microsoft.AspNetCore.Mvc": "1.0.1",
},

"frameworks": {
"net452": {
"dependencies": {
"ClassLibrary1": {
"target": "project"
}
}
}
}
}

Notice also that it's a "target": "project" dependency here, as I didn't build a NuGet package and reference that.

Since the .NET Core tooling is in preview there are some gotchas when doing this today.

  • Make sure that all your class libraries are targeting an appropriate version of the .NET Framework.
    • For example, don't have a 4.5.2 Web App targeting a 4.6.2 Class Library. This could bite you in subtle ways if things don't line up in production.
  • dotnet restore at the command line may well get confused and give you an error like:
    • Errors in D:\github\WebApplication2\src\WebApplication2\project.json - Unable to resolve 'ClassLibrary1' for '.NETFramework,Version=v4.5.2'.
    • Use Visual Studio to build or run msbuild at the command line.
  • You can restore packages from the command line with nuget.exe version 3.4.4 or greater if you restore on the .sln file like this:
    • D:\github\WebApplication2>nuget restore WebApplication2.sln
      MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
    • I recommend you run nuget.exe to see what version you have and run nuget update -self to have it update itself.

These gotchas will be fixed when the tooling is finalized.

Hope this helps!


Sponsor: Big thanks to Telerik for sponsoring the blog this week! 60+ ASP.NET Core controls for every need. The most complete UI toolset for x-platform responsive web and cloud development. Try now 30 days for free!



© 2016 Scott Hanselman. All rights reserved.
     

Exploring ASP.NET Core with Docker in both Linux and Windows Containers


In May of last year doing things with ASP.NET and Docker was in its infancy. But cool stuff was afoot. I wrote a blog post showing how to publish an ASP.NET 5 (5 at the time, now Core 1.0) app to Docker. Later in December of 2015 new tools like Docker Toolbox and Kitematic made things even easier. In May of 2016 Docker for Windows Beta continued to move the ball forward nicely.

I wanted to see how things are looking with ASP.NET Core, Docker, and Windows here in October of 2016.

I installed these things:

Docker for Windows is really nice as it automates setting up Hyper-V for you and creates the Docker host OS and gets it all running. This is a big time saver.

Hyper-V manager

There's my Linux host that I don't really have to think about. I'll do everything from the command line or from Visual Studio.

I'll say File | New Project and make a new ASP.NET Core application running on .NET Core.

Then I right click and Add | Docker Support. This menu comes from the Visual Studio Tools for Docker extension. This adds a basic Dockerfile and some docker-compose files. Out of the box, I'm all setup to deploy my ASP.NET Core app to a Docker Linux container.

ASP.NET Core in a Docker Linux Container

Starting from my ASP.NET Core app, I'll make sure my base image (that's the FROM in the Dockerfile) is the base ASP.NET Core image for Linux.

FROM microsoft/aspnetcore:1.0.1

ENTRYPOINT ["dotnet", "WebApplication4.dll"]
ARG source=.
WORKDIR /app
EXPOSE 80
COPY $source .

Next, since I don't want Docker to do the building of my application yet, I'll publish it locally. Be sure to read Steve Lasker's blog post "Building Optimized Docker Images with ASP.NET Core" to learn how to have one docker container build your app and another run it. This optimizes server density and resource usage.

I'll publish, then build the images, and run it.

>dotnet publish


>docker build bin\Debug\netcoreapp1.0\publish -t aspnetcoreonlinux

>docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
aspnetcoreonlinux latest dab2bff7e4a6 28 seconds ago 276.2 MB
microsoft/aspnetcore 1.0.1 2e781d03cb22 44 hours ago 266.7 MB

>docker run -it -d -p 85:80 aspnetcoreonlinux
1cfcc8e8e7d4e6257995f8b64505ce25ae80e05fe1962d4312b2e2fe33420413

>docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1cfcc8e8e7d4 aspnetcoreonlinux "dotnet WebApplicatio" 2 seconds ago Up 1 seconds 0.0.0.0:85->80/tcp clever_archimedes

And there's my ASP.NET Core app running in Docker. So I'm running Windows, running Hyper-V, running a Linux host that is hosting Docker containers.

What else can I do?

ASP.NET Core in a Docker Windows Container running Windows Nano Server

There's Windows Server, there's Windows Server Core that removes the UI among other things and there's Windows Nano Server which gets Windows down to like hundreds of megs instead of many gigs. This means there's a lot of great choices depending on what you need for functionality and server density. Ship as little as possible.

Let me see if I can get ASP.NET Core running on Kestrel under Windows Nano Server. Certainly, since Nano is very capable, I could run IIS within the container and there's docs on that.

Michael Friis from Docker has a great blog post on building and running your first Docker Windows Server Container. With the new Docker for Windows you can just right click on it and switch between Linux and Windows Containers.

Docker switches between Linux and Windows containers easily

So now I'm using Docker with Windows Containers. You may not know that you likely already have Windows Containers! It was shipped inside Windows 10 Anniversary Edition. You can check for Containers in Features:

Add Containers in Windows 10

I'll change my Dockerfile to use the Windows Nano Server image. I can also control the ports that ASP.NET talks on if I like with an Environment Variable and Expose that within Docker.

FROM microsoft/dotnet:nanoserver

ENTRYPOINT ["dotnet", "WebApplication4.dll"]
ARG source=.
WORKDIR /app
ENV ASPNETCORE_URLS http://+:82
EXPOSE 82
COPY $source .

Then I'll publish and build...

>dotnet publish

>docker build bin\Debug\netcoreapp1.0\publish -t aspnetcoreonnano

Then I'll run it, mapping the ports from Windows outside to the Windows container inside!

NOTE: There's a bug as of this writing that affects how Windows 10 talks to Containers via "NAT" (Network Address Translation) such that you can't easily go http://localhost:82 like you (and I) want to. Today you have to hit the IP of the container directly. I'll report back once I hear more about this bug and how it gets fixed. It'll show up in Windows Update one day. The workaround is to get the IP address of the container from docker like this:  docker inspect -f "{{ .NetworkSettings.Networks.nat.IPAddress }}" HASH

So I'll run my ASP.NET Core app on Windows Nano Server (again, to be clear, this is running on Windows 10 and Nano Server is inside a Container!)

>docker run -it -d -p 88:82 aspnetcoreonnano

afafdbead8b04205841a81d974545f033dcc9ba7f761ff7e6cc0ec8f3ecce215

>docker inspect -f "{{ .NetworkSettings.Networks.nat.IPAddress }}" afa
172.16.240.197

Now I can hit that site with 172.16.240.197:82. Once that bug above is fixed, it'll get hit and routed like any container.

The best part about Windows Containers is that they are fast and lightweight. Once the image is downloaded and built on your machine, you're starting and stopping them in seconds with Docker.

BUT, you can also isolate Windows Containers using Docker like this:

docker run --isolation=hyperv -it -d -p 86:82 aspnetcoreonnano

So now this instance is running fully isolated within Hyper-V itself. You get the best of all worlds. Speed and convenient deployment plus optional and easy isolation.

ASP.NET Core in a Docker Windows Container running Windows Server Core 2016

I can then change the Dockerfile to use the full Windows Server Core image. This is 8 gigs so be ready as it'll take a bit to download and extract but it is really Windows. You can also choose to run this as a container or as an isolated Hyper-V container.

Here I just change the FROM to get a Windows Sever Core with .NET Core included.

FROM microsoft/dotnet:1.0.0-preview2-windowsservercore-sdk

ENTRYPOINT ["dotnet", "WebApplication4.dll"]
ARG source=.
WORKDIR /app
ENV ASPNETCORE_URLS http://+:82
EXPOSE 82
COPY $source .

NOTE: I hear it's likely that the .NET Core on Windows Server Core images will go away. It makes more sense for .NET Core to run on Windows Nano Server or other lightweight images. You'll use Server Core for heavier stuff. If you REALLY want to have .NET Core on Server Core you can make your own Dockerfile and easily build an image that has the things you want.

Then I'll publish, build, and run again.

>docker images

REPOSITORY TAG IMAGE ID CREATED SIZE
aspnetcoreonnano latest 7e02d6800acf 24 minutes ago 1.113 GB
aspnetcoreonservercore latest a11d9a9ba0c2 28 minutes ago 7.751 GB

Since containers are so fast to start and stop I can have a complete web farm running with Redis in a Container, SQL in another, and my web stack in a third. Or mix and match.

>docker ps

CONTAINER ID IMAGE COMMAND PORTS NAMES
d32a981ceabb aspnetcoreonwindows "dotnet WebApplicatio" 0.0.0.0:87->82/tcp compassionate_blackwell
a179a48ca9f6 aspnetcoreonnano "dotnet WebApplicatio" 0.0.0.0:86->82/tcp determined_stallman
170a8afa1b8b aspnetcoreonnano "dotnet WebApplicatio" 0.0.0.0:89->82/tcp agitated_northcutt
afafdbead8b0 aspnetcoreonnano "dotnet WebApplicatio" 0.0.0.0:88->82/tcp naughty_ramanujan
2cf45ea2f008 a7fa77b6f1d4 "dotnet WebApplicatio" 0.0.0.0:97->82/tcp sleepy_hodgkin

Conclusion

Again, go check out Michael's article where he uses Docker Compose to bring up the ASP.NET Music Store sample with SQL Express in one Windows Container and ASP.NET Core in another as well as Steve Lasker's blog (in fact his whole blog is gold) on making optimized Docker images with ASP.NET Core.

IMAGE ID            REPOSITORY                    TAG                 SIZE

0ec4274c5571 web optimized 276.2 MB
f9f196304c95 web single 583.8 MB
f450043e0a44 microsoft/aspnetcore 1.0.1 266.7 MB
706045865622 microsoft/aspnetcore-build 1.0.1 896.6 MB

Steve points out a number of techniques that will allow you to get the most out of Docker and ASP.NET Core.

The result of all this (IMHO) is that you can use ASP.NET Core:

  • ASP.NET Core on Linux
    • within Docker containers
    • in any Cloud
  • ASP.NET Core on Windows, Windows Server, Server Core, and Nano Server.
    • within Docker windows containers
    • within Docker isolated Hyper-V containers

This means you can choose the level of feature support and size to optimize for server density and convenience. Once all the tooling (the Docker folks with Docker for Windows and the VS folks with Visual Studio Docker Tools) is baked, we'll have nice debugging and workflows from dev to production.

What have you been doing with Docker, Containers, and ASP.NET Core? Sound off in the comments.


Sponsor: Thanks to Redgate this week! Discover the world’s most trusted SQL Server comparison tool. Enjoy a free trial of SQL Compare, the industry standard for comparing and deploying SQL Server schemas.



© 2016 Scott Hanselman. All rights reserved.
     

Learning Arduino the fun way - Writing Games with Arduboy


My kids and I are always tinkering with gadgets and electronics. If you follow me on Instagram you'll notice our adventures as we've built a small Raspberry Pi powered arcade, explored retro-tech, built tiny robots, 3D printed a GameBoy (PiGrrl, in fact), and lots more.

While we've done a bunch of small projects with Arduinos, it's fair to say that there's a bit of a gap when one is getting started with Arduino. Arduinos aren't like Raspberry PIs. They don't typically have a screen or boot to a desktop. They are amazing, to be sure, but not everyone lights up when faced with a breadboard and a bunch of wires.

The Arduboy is a tiny, inexpensive hardware development platform based on Arduino. It's like a GameBoy that has an Arduino at its heart. It comes exactly as you see in the picture to the right. It uses a micro-USB cable (included) and has buttons, a very bright black and white OLED screen, and a speaker. Be aware, it's SMALL. Smaller than a GameBoy. This is a game that will fit in an 8-year-old's pocket. It's definitely fun-sized and kid-sized. I could fit a half-dozen in my pocket.

The quick start for the Arduboy is quite clear. My 8 year old and I were able to get Hello World running in about 10 minutes. Just follow the guide and be sure to paste in the custom Board Manager URL to enable support in the IDE for "Arduboy."

The Arduboy is just like any other Arduino in that it shows up as a COM port on your Windows machine. You use the same free Arduino IDE to program it, and you utilize the very convenient Arduboy libraries to access sound, draw on the screen, and interact with the buttons.

To be clear, I have no relationship with the Arduboy folks, I just think it's a killer product. You can order an Arduboy for US$49 directly from their website. It's available in 5 colors and has these specs:

Specs

  • Processor: ATmega32u4 (same as Arduino Leonardo & Micro)
  • Memory: 32KB Flash, 2.5KB RAM, 1KB EEPROM
  • Connectivity: USB 2.0 w/ built in HID profile
  • Inputs: 6 momentary tactile buttons
  • Outputs: 128x64 1Bit OLED, 2 Ch. Piezo Speaker & Blinky LED
  • Battery: 180 mAh Thin-Film Lithium Polymer
  • Programming: Arduino IDE, GCC & AVRDude

There's also a friendly little app called Arduboy Manager that connects to an online repository of nearly 50 games and quickly installs them. This proved easier for my 8 year old than downloading the source, compiling, and uploading each time he wanted to try a new game.

The best part about Arduboy is its growing community. There's dozens if not hundreds of people learning how to program and creating games. Even if you don't want to program one, the list of fun games is growing every day.

The games are all open source and you can read the code while you play them. As an example, there's a little game called CrazyKart and the author says it's their first game! The code is on GitHub. Just clone it (or download a zip file) and open the .ino file into your Arduino IDE.

Arduboys are easy to program

Compile and upload the app while the Arduboy is connected to your computer. The Arduboy holds just one game at a time. Here's Krazy Kart as a gif:

Because the Arduboy is so constrained, it's a nice foray into game development for little ones - or any one. The screen is just 128x64 and most games use sprites consisting of 1 bit (just black or white). The Arduboy library is, of course, also open source and includes the primitives that most games will need, as well as lots of examples. You can draw bitmaps, swap frames, draw shapes, and draw characters.

We've found the Arduboy to be an ideal on ramp for the kids to make little games and learn basic programming. It's a bonus that they can easily take their games with them and share with their friends.

Related Links


Sponsor: Thanks to Redgate this week! Discover the world’s most trusted SQL Server comparison tool. Enjoy a free trial of SQL Compare, the industry standard for comparing and deploying SQL Server schemas.



© 2016 Scott Hanselman. All rights reserved.
     

