<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Python on Jakob Johnson</title>
    <link>https://jakobj.dev/tags/python/</link>
    <description>Recent content in Python on Jakob Johnson</description>
    <generator>Hugo -- 0.147.2</generator>
    <language>en-us</language>
    <lastBuildDate>Wed, 16 Nov 2022 11:22:47 -0700</lastBuildDate>
    <atom:link href="https://jakobj.dev/tags/python/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>PyTorch Tutorial</title>
      <link>https://jakobj.dev/posts/pytorch-tutorial/</link>
      <pubDate>Wed, 16 Nov 2022 11:22:47 -0700</pubDate>
      <guid>https://jakobj.dev/posts/pytorch-tutorial/</guid>
      <description>&lt;p&gt;Most of the content here is from the &lt;a href=&#34;https://pytorch.org/tutorials/beginner/basics/intro.html&#34;&gt;official PyTorch tutorial&lt;/a&gt;. I made this version more concise to present to a class.&lt;/p&gt;
&lt;p&gt;You can get the &lt;a href=&#34;https://github.com/jakobottar/pytorch-tutorial&#34;&gt;notebook here&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&#34;pytorch-background&#34;&gt;PyTorch Background&lt;/h2&gt;
&lt;p&gt;Data in PyTorch is stored in Tensors, which are almost identical to NumPy arrays.&lt;/p&gt;
&lt;p&gt;The key differences are:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Auto gradient calculation (with &lt;code&gt;torch.autograd&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Ability to move to a GPU (with &lt;code&gt;Tensor.to(device)&lt;/code&gt;)&lt;/li&gt;
&lt;/ol&gt;
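&lt;p&gt;A minimal sketch of both differences (assuming a standard PyTorch install; the &lt;code&gt;cuda&lt;/code&gt; branch only runs if a GPU is actually present):&lt;/p&gt;

```python
import torch

# 1. automatic gradient calculation via torch.autograd
x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()  # y = 2^2 + 3^2 = 13
y.backward()        # fills x.grad with dy/dx = 2x
print(x.grad)       # tensor([4., 6.])

# 2. moving a tensor to an accelerator if one is available
device = "cuda" if torch.cuda.is_available() else "cpu"
x_dev = x.to(device)
print(x_dev.device)
```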
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#f92672&#34;&gt;import&lt;/span&gt; torch
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;data &lt;span style=&#34;color:#f92672&#34;&gt;=&lt;/span&gt; [[&lt;span style=&#34;color:#ae81ff&#34;&gt;1&lt;/span&gt;,&lt;span style=&#34;color:#ae81ff&#34;&gt;2&lt;/span&gt;,&lt;span style=&#34;color:#ae81ff&#34;&gt;3&lt;/span&gt;], [&lt;span style=&#34;color:#ae81ff&#34;&gt;4&lt;/span&gt;,&lt;span style=&#34;color:#ae81ff&#34;&gt;5&lt;/span&gt;,&lt;span style=&#34;color:#ae81ff&#34;&gt;6&lt;/span&gt;], [&lt;span style=&#34;color:#ae81ff&#34;&gt;7&lt;/span&gt;,&lt;span style=&#34;color:#ae81ff&#34;&gt;8&lt;/span&gt;,&lt;span style=&#34;color:#ae81ff&#34;&gt;9&lt;/span&gt;]]
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;data_tensor &lt;span style=&#34;color:#f92672&#34;&gt;=&lt;/span&gt; torch&lt;span style=&#34;color:#f92672&#34;&gt;.&lt;/span&gt;tensor(data)
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;print(data_tensor)
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;pre&gt;&lt;code&gt;tensor([[1, 2, 3],
        [4, 5, 6],
        [7, 8, 9]])
&lt;/code&gt;&lt;/pre&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ones_tensor &lt;span style=&#34;color:#f92672&#34;&gt;=&lt;/span&gt; torch&lt;span style=&#34;color:#f92672&#34;&gt;.&lt;/span&gt;ones(size&lt;span style=&#34;color:#f92672&#34;&gt;=&lt;/span&gt;data_tensor&lt;span style=&#34;color:#f92672&#34;&gt;.&lt;/span&gt;shape, dtype&lt;span style=&#34;color:#f92672&#34;&gt;=&lt;/span&gt;int)
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;print(ones_tensor)
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#75715e&#34;&gt;# these tensors behave almost exactly like numpy arrays&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;print(ones_tensor &lt;span style=&#34;color:#f92672&#34;&gt;@&lt;/span&gt; data_tensor)
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;pre&gt;&lt;code&gt;tensor([[1, 1, 1],
        [1, 1, 1],
        [1, 1, 1]])
tensor([[12, 15, 18],
        [12, 15, 18],
        [12, 15, 18]])
&lt;/code&gt;&lt;/pre&gt;
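&lt;p&gt;Since tensors behave almost exactly like NumPy arrays, here is a quick sketch of the zero-copy bridge between the two libraries (assuming NumPy is installed alongside PyTorch):&lt;/p&gt;

```python
import numpy as np
import torch

arr = np.arange(9, dtype=np.int64).reshape(3, 3)
t = torch.from_numpy(arr)  # shares memory with arr, no copy
arr[0, 0] = 100            # the change is visible through the tensor
print(t[0, 0])             # tensor(100)

back = t.numpy()           # zero-copy view back to NumPy
print(back.shape)          # (3, 3)
```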
&lt;h2 id=&#34;datasets--dataloaders&#34;&gt;Datasets &amp;amp; DataLoaders&lt;/h2&gt;
&lt;p&gt;Some datasets are available from PyTorch&amp;rsquo;s own libraries, such as MNIST or Fashion-MNIST.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Python Management with Pyenv</title>
      <link>https://jakobj.dev/posts/pyenv/</link>
      <pubDate>Wed, 17 Aug 2022 00:00:00 +0000</pubDate>
      <guid>https://jakobj.dev/posts/pyenv/</guid>
      <description>&lt;h2 id=&#34;why-use-pyenv&#34;&gt;Why use Pyenv?&lt;/h2&gt;
&lt;p&gt;Python virtual environments have long been a useful way of managing Python packages and package versions. With vanilla Python, &lt;code&gt;virtualenv&lt;/code&gt; is available, and for more complex cases Anaconda is a popular choice. Using these keeps your system installation of Python free of unnecessary clutter and packages, and makes it easy to share dependencies with &lt;code&gt;pip freeze&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;But what if some of the packages you want to install aren&amp;rsquo;t available for your system&amp;rsquo;s installation of Python? Or what if your system is stuck on an old version of Python and you want to use the brand new shiny Python 3.13? That&amp;rsquo;s where &lt;a href=&#34;https://github.com/pyenv/pyenv&#34;&gt;pyenv&lt;/a&gt; comes in. It&amp;rsquo;s a series of shell &amp;lsquo;shims&amp;rsquo; that seamlessly swap between different installations of Python. It&amp;rsquo;s frequently updated, which means new versions of Python get added regularly, so you can always stay up to date. Even better, since it&amp;rsquo;s simply shell commands, you can install it without root access on your work machines!&lt;/p&gt;</description>
    </item>
  </channel>
</rss>
