<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">

  <title><![CDATA[Ferran Salguero]]></title>
  <link href="http://ferransalguero.github.io/atom.xml" rel="self"/>
  <link href="http://ferransalguero.github.io/"/>
  <updated>2024-03-15T13:33:53+01:00</updated>
  <id>http://ferransalguero.github.io/</id>
  <author>
    <name><![CDATA[Ferran Salguero]]></name>
    
  </author>
  
  
  <entry>
    <title type="html"><![CDATA[Simple free personal unblocker in .NET]]></title>
    <link href="http://ferransalguero.github.io/blog/Simple-free-personal-unblocke-net/"/>
    <updated>2023-01-26T00:00:00+01:00</updated>
    <id>http://ferransalguero.github.io/blog/Simple-free-personal-unblocke-net</id>
<content type="html"><![CDATA[<p>It’s been quite some time since I posted anything, but last month AppHarbor stopped service. It was a simple cloud application platform for .NET with a very good free tier, which I used extensively for testing, trying new tech and learning with personal projects. They were slow to adopt .NET Core, and many of my sites were still using the classic .NET Framework.</p>

<p>I want to share a simple unblocker that I used for quite some time and <a href="https://github.com/FerranSalguero/Unblocker">recently published on GitHub</a>. If your ISP blocks some sites, this is a simple and free alternative, for example using the free tier in Azure.</p>

<p>You can browse a site once you configure the destination URL using the query string:</p>

<p>Deploy the application and request the root with the query string parameter <code>q</code> to set the proxy forwarding destination.
Sample: <a href="https://yourpersonaldomain.com?q=https://github.com">https://yourpersonaldomain.com?q=https://github.com</a></p>

<p>Then browse the root, or append the desired path, and the application will redirect you, in this case to GitHub.</p>

<p>Samples:</p>

<ul>
  <li><a href="https://yourpersonaldomain.com">https://yourpersonaldomain.com</a></li>
  <li><a href="https://yourpersonaldomain.com/FerranSalguero/">https://yourpersonaldomain.com/FerranSalguero/</a></li>
</ul>
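<p>To illustrate, here is a minimal sketch of the forwarding idea in Node.js (the actual project is ASP.NET; the function and state names here are hypothetical, not taken from the repository):</p>

```javascript
// Hypothetical sketch of the unblocker's forwarding logic (the real
// project is ASP.NET): ?q=... stores the destination, and subsequent
// requests are redirected to that destination plus the requested path.
function resolveRedirect(requestUrl, state) {
  const url = new URL(requestUrl);
  const q = url.searchParams.get('q');
  if (q) state.destination = q;        // ?q=... (re)configures the target
  if (!state.destination) return null; // no destination configured yet
  return new URL(url.pathname, state.destination).href;
}

const state = {};
// Configure the destination once, then browse a path through the proxy:
console.log(resolveRedirect('https://yourpersonaldomain.com/?q=https://github.com', state));
// → https://github.com/
console.log(resolveRedirect('https://yourpersonaldomain.com/FerranSalguero/', state));
// → https://github.com/FerranSalguero/
```

<p>The real application keeps this state server-side and issues the redirects itself; the sketch only shows how the destination URL is resolved.</p>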
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Using autoprefixer, Sass or CoffeeScript with Gulp, vNext and Visual Studio 2015]]></title>
    <link href="http://ferransalguero.github.io/blog/Using-autoprefixer-sass-coffeescript-with-gulp-vnext-visual-studio-2015/"/>
    <updated>2015-08-26T00:00:00+02:00</updated>
    <id>http://ferransalguero.github.io/blog/Using-autoprefixer-sass-coffeescript-with-gulp-vnext-visual-studio-2015</id>
<content type="html"><![CDATA[<p>Last month I wrote about <a href="/blog/Using-autoprefixer-in-Visual-Studio">using autoprefixer with Visual Studio</a>, but when Visual Studio 2015 came out we found that the extension we used to run autoprefixer (Web Essentials) behaves a bit differently there. So I decided to seize the moment and explore the new approach that ASP.NET 5 introduces, as we can see in the sample project, with tools widely used in the front-end development community like Gulp or Bower.</p>

<p>Gulp, by its own description, is a task runner that works as a build system for the web ecosystem, covering everything from minification to compiling Sass or CoffeeScript. To run those tasks you need some plugins and a configuration file declaring the tasks to run.</p>

<p>In this <a href="https://github.com/FerranSalguero/AppHarborAspNet5Test/">simple example uploaded to GitHub</a> you can see how Gulp can be used to compile Sass and CoffeeScript and to run autoprefixer on the generated CSS. This was a test project to try some of these features in the new vNext paradigm and to deploy it on <a href="https://appharbor.com/" rel="nofollow">AppHarbor</a>, one of my favourite hosting sites, as you may already know if you have checked some of <a href="/portfolio">my projects</a>.</p>
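<p>A gulpfile for such a setup could look roughly like this (a sketch only: the plugin names are the usual community packages of the time, and the paths are assumptions, not taken from the sample project):</p>

```javascript
// gulpfile.js — illustrative sketch using common plugins of the gulp 3 era
// (gulp-sass, gulp-coffee, gulp-autoprefixer); adjust paths to your project.
var gulp = require('gulp');
var sass = require('gulp-sass');
var coffee = require('gulp-coffee');
var autoprefixer = require('gulp-autoprefixer');

gulp.task('styles', function () {
  return gulp.src('assets/scss/**/*.scss')
    .pipe(sass())               // compile Sass to CSS
    .pipe(autoprefixer())       // add vendor prefixes to the generated CSS
    .pipe(gulp.dest('wwwroot/css'));
});

gulp.task('scripts', function () {
  return gulp.src('assets/coffee/**/*.coffee')
    .pipe(coffee())             // compile CoffeeScript to JavaScript
    .pipe(gulp.dest('wwwroot/js'));
});

gulp.task('default', ['styles', 'scripts']);
```

<p>Running <code>gulp</code> then executes both tasks; the sample project on GitHub shows the exact configuration actually used.</p>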

<h1 id="notes">Notes</h1>

<ul>
  <li>Apart from checking whether AppHarbor supports ASP.NET 5 projects, I also want to see if it executes the Gulp tasks on build</li>
  <li>If you want to compile this project you need Visual Studio 2015; the Community Edition is more than enough</li>
  <li>To test this project in your local IIS 7+ you must create a new virtual directory pointing to the project’s wwwroot folder; the application pool must use the Integrated pipeline and framework v4.0</li>
</ul>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Using autoprefixer in Visual Studio]]></title>
    <link href="http://ferransalguero.github.io/blog/Using-autoprefixer-in-Visual-Studio/"/>
    <updated>2015-07-14T00:00:00+02:00</updated>
    <id>http://ferransalguero.github.io/blog/Using-autoprefixer-in-Visual-Studio</id>
<content type="html"><![CDATA[<p>Autoprefixer helps complete your CSS stylesheets by adding vendor-specific prefixes to rules, making your styles compatible with more browsers, usually older versions.</p>

<p>Visual Studio does not support autoprefixer directly, so you will have to use an extension; I recommend <a href="http://vswebessentials.com/" rel="nofollow">Web Essentials</a>, but you will need at least Visual Studio 2013 to use autoprefixer. Web Essentials also adds support for many useful web languages, like SCSS, Markdown or CoffeeScript.</p>

<p>Once Web Essentials is installed, go to the Tools -&gt; Options menu, open the Web Essentials -&gt; CSS tab and set the ‘Enable Autoprefixer’ option to ‘True’. You can also configure which browsers you want to target for specific prefixes; if you want the most compatible website I recommend ‘&gt;0%’, i.e. browsers with any usage share. By default autoprefixer targets the last 2 versions of the major browsers; you can check the <a href="https://github.com/ai/browserslist#queries" rel="nofollow">Browserslist docs</a> for the full reference.</p>

<p><img src="/images/posts/autoprefixer.jpg" alt="Autoprefixer config" /></p>

<p>[Update 2015-08-26]
Sadly, on Visual Studio 2015 Web Essentials does not support autoprefixer, but we can <a href="/blog/Using-autoprefixer-sass-coffeescript-with-gulp-vnext-visual-studio-2015/">use Gulp to run a task that executes autoprefixer on our CSS files</a>.</p>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Recommended PC builds Summer 2015]]></title>
    <link href="http://ferransalguero.github.io/blog/Recommended-PC-builds-summer-2015/"/>
    <updated>2015-06-21T00:00:00+02:00</updated>
    <id>http://ferransalguero.github.io/blog/Recommended-PC-builds-summer-2015</id>
    <content type="html"><![CDATA[
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Looking for a DDR3 1333Mhz Ripjaws 4GB module from Jun 2014]]></title>
    <link href="http://ferransalguero.github.io/blog/Looking-for-DDR3-1333-Ripjaws-module-Jun-2014/"/>
    <updated>2015-05-20T00:00:00+02:00</updated>
    <id>http://ferransalguero.github.io/blog/Looking-for-DDR3-1333-Ripjaws-module-Jun-2014</id>
<content type="html"><![CDATA[<p>[Update 2015-09-20]<br />
I finally managed to get an old Ripjaws module and, as expected, my computer handled it seamlessly, so I am now finally at 8GB of RAM. Thanks to all who helped out!</p>

<p>As my computer is already a few years old, it isn’t compatible with some newer revisions of this same memory; I tested this myself. I already have a module manufactured in Jun 2014 that works fine, but a few weeks back I bought one in a local store from Oct 2014 that didn’t work in my computer, and I had to return it.</p>

<p>You can check the production date on the module’s warranty sticker.</p>

<p>As many have suggested, this seems to be due to the chip configuration: on newer revisions manufacturers are using high-density chips of at least 512MB, so some old motherboards are unable to handle these modern chips and are bound to modules built from 256MB chips. Of course, using fewer chips is an advantage for reducing manufacturing costs.</p>

<p><img src="/images/posts/ripjaws.jpg" alt="Ripjaws module date" /></p>

<p>For this reason I am actively searching on eBay for a <a href="http://wheretobuy.apphb.com/ripjaws%20ddr3">Ripjaws DDR3</a> module manufactured around Jun 2014 or built with 256MB chips. If you are a store or an individual who happens to have one and is willing to ship to Spain, please let me know. Thanks in advance!</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Internet Explorer version testing with VMs and Vagrant]]></title>
    <link href="http://ferransalguero.github.io/blog/Internet-Explorer-version-testing-with-vm-vagrant/"/>
    <updated>2015-05-01T00:00:00+02:00</updated>
    <id>http://ferransalguero.github.io/blog/Internet-Explorer-version-testing-with-vm-vagrant</id>
<content type="html"><![CDATA[<p>To simplify testing with different versions of Internet Explorer, and because of Explorer’s tight coupling with the operating system version, we need to test these different versions on virtual machines that contain the desired version of Explorer. Thanks to <a href="https://www.modern.ie/" rel="nofollow">modern.ie</a> this is a simple task, as they provide pre-configured virtual machines with the different versions of Explorer.</p>

<p>But it’s not as perfect as it seems: the OS on these virtual machines is, of course, not free, so it will ask for activation a few weeks after first starting the virtual machine. You can extend this period with the so-called re-arm procedure described on the desktop wallpaper of these VMs, but with a limit of a couple of re-arms. After that you need to re-download the original VM and configure it again.</p>

<p>To avoid this, you can simplify the process by making a backup of the VM before re-arming; that way you keep an expired VM that needs re-arming but is already configured, ready to reuse from time to time.</p>

<p>With Vagrant we can easily manage this procedure: <a href="https://www.vagrantup.com/" rel="nofollow">Vagrant</a> lets you start a virtual machine from a packaged box file. This file can be stored remotely, so Vagrant will download the package, import the VM into your local VirtualBox and start it. Vagrant works best with VirtualBox, so we will describe the configuration process for it.</p>

<h2 id="necessary-software">Necessary software</h2>

<ul>
  <li><a href="https://www.virtualbox.org/wiki/Downloads" rel="nofollow">Virtual Box</a></li>
  <li><a href="https://www.vagrantup.com/downloads.html" rel="nofollow">Vagrant</a></li>
</ul>

<h2 id="preparing-packaged-boxes">Preparing packaged boxes</h2>

<p>Download the Vagrant boxes provided by <a href="http://blog.syntaxc4.net/post/2014/09/03/windows-boxes-for-vagrant-courtesy-of-modern-ie.aspx" rel="nofollow">modern.ie</a>, install them following <a href="https://gist.github.com/andreptb/57e388df5e881937e62a" rel="nofollow">this method</a>, configure them to your needs (proxy settings, updates, required software, etc.), and then you will be able to repack with:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>vagrant package --output "yourboxname"
</code></pre></div></div>

<p>We will not repackage the Vagrantfile, as we will keep it in our repository for easy setup across the team.</p>

<p>Now you can copy the packaged box to a shared directory and, of course, change the box URL in the Vagrantfile to point to that shared folder.</p>

<p>Finally, upload these Vagrantfiles, one per configuration, to your source code repository or shared folder.</p>

<h2 id="resources">Resources</h2>

<ul>
  <li><a href="https://gist.github.com/andreptb/57e388df5e881937e62a">Step by step modern.ie box configuration</a></li>
  <li><a href="http://www.emoxter.com/welcome-to-ghost/">Step by step WinXP box configuration</a></li>
  <li><a href="http://blog.syntaxc4.net/post/2014/09/03/windows-boxes-for-vagrant-courtesy-of-modern-ie.aspx">Virtual boxes from modern.ie</a></li>
</ul>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Chrome hangs internet connection with Atheros LAN chips]]></title>
    <link href="http://ferransalguero.github.io/blog/Chrome-hangs-Internet-connection-with-Atheros-LAN-chips/"/>
    <updated>2013-06-26T00:00:00+02:00</updated>
    <id>http://ferransalguero.github.io/blog/Chrome-hangs-Internet-connection-with-Atheros-LAN-chips</id>
<content type="html"><![CDATA[<p>From time to time I find that Chrome hangs my internet connection on typical tasks like uploading an image. I had solved this problem a few years ago, and suddenly ran into it again after installing Windows 8.</p>

<p>As it turns out, the drivers for the <em>Atheros L1 Gigabit Ethernet</em> controller on <strong>Windows 7</strong> or, as I found out recently, on <strong>Windows 8</strong>, are unable to support task offloading properly, dropping the current internet connection, which can only be recovered by disabling and re-enabling the connection.</p>

<p><strong>To fix this problem you need to set the task offload property to ‘off’ in <em>Device manager</em>&gt;<em>network adapter</em>&gt;<em>select_your_Atheros_adapter</em>&gt;<em>properties</em>&gt;<em>advanced</em>&gt;<em>task offload:value=off</em></strong>.</p>

<p>Check this screenshot, in a perfect English-Spanish combination :)</p>

<p><img src="/images/posts/atheros.jpg" alt="Atheros task offload property" /></p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Powershell recursive empty folder deletion script]]></title>
    <link href="http://ferransalguero.github.io/blog/Powershell-recursive-deletion-script/"/>
    <updated>2012-11-02T00:00:00+01:00</updated>
    <id>http://ferransalguero.github.io/blog/Powershell-recursive-deletion-script</id>
<content type="html"><![CDATA[<p>I’ve been using <em>PowerShell</em> to <strong>automate some tedious tasks</strong> previously done through the visual interface; it is a powerful tool and I recommend that everybody with some programming skills look into it. The latest task has been removing empty directories in a hierarchy: the deepest directory, being empty, is deleted; then each parent, once all its child directories have been deleted for being empty, is deleted if it ends up empty too. That’s the main idea.</p>

<p>I’ve taken a <a href="http://guyellisrocks.com/powershell/powershell-script-to-remove-empty-directories/" rel="nofollow">script already posted</a> by Guy Ellis on his blog to delete empty folders and taken it one step further: I’ve created a recursive function for the deletion that is called before checking the contents of the directory.</p>

<p>The function is this:
<script src="https://gist.github.com/ferransalguero/5217602.js">
</script></p>

<p>I’ve found a problem in the second Get-ChildItem for some directory names with strange characters; it could probably be replaced with some other cmdlet to determine whether the current directory is empty. I will check it as soon as I can.</p>

]]></content>
  </entry>
  
  
</feed>