<canvas id="responsive-canvas"></canvas>
<div class="centringContainer">
<div class="table">
<div class="row">
<div class="left">
<label for="FreqSlider"><strong>Frequency:</strong></label>
</div>
<div class="slidecontainer">
<input type="range" min="2" max="16" step="0.01" value="2" class="slider" id="FreqSlider"/>
</div>
</div>
</div>
<div class="main">
<div class="full">
<p>
This is a demonstration of aliasing in a digital system. Try turning on the <strong>View aliasing</strong> toggle and watch what happens to the orange signal when you increase the frequency of the blue signal.
</p>
</div>
<div class="table">
<div class="row">
<div class="left">
<label for="AliasedToggle"><strong>View aliasing:</strong></label>
</div>
<div class="togglecontainer">
<label class="switch">
<input type="checkbox" id="AliasedToggle">
<span class="toggleSlide" id="orangeBack"></span>
</label>
</div>
</div>
</div>
<div class="full">
<p>
The orange signal represents our attempt to reproduce the analogue (blue) signal given purely digital data.
</p>
<p>
We are only able to convert analogue signals into digital data by <em>sampling</em> them at regular intervals.
</p>
<p>
<strong>View samples</strong> visualises this digital data. Can you see why the orange signal does not follow the original blue signal?
</p>
</div>
<div class="table">
<div class="row">
<div class="left">
<label for="SamplesToggle"><strong>View samples:</strong></label>
</div>
<div class="togglecontainer">
<label class="switch">
<input type="checkbox" id="SamplesToggle">
<span class="toggleSlide"></span>
</label>
</div>
</div>
</div>
<div class="full">
<p>
This effect is called <strong>aliasing</strong>. Because the frequency of our input signal is greater than <em>half</em> of our digital sample rate, we are unable to reproduce these higher frequencies accurately.
</p>
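<!--
Rough worked example, using the constants in the JavaScript (220 Hz base pitch,
1660 Hz simulated sample rate): at the top of the Frequency slider the generated
tone is 8 x 220 = 1760 Hz, which is above the 1660 / 2 = 830 Hz Nyquist limit,
so it folds back and is heard at roughly |1760 - 1660| = 100 Hz.
-->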
<p>
Turn on audio to hear what aliased digital audio sounds like (tested in Chrome and Firefox 76).
</p>
</div>
<div class="table">
<div class="row">
<div class="left">
<label for="AudioToggle"><strong>Toggle audio:</strong></label>
</div>
<div class="togglecontainer">
<button type="button" onclick="audioToggle()" class="button" id="AudioToggle">
<span class="material-icons">volume_off</span>
</button>
</div>
</div>
</div>
<div class="full">
<p>
<br/>
<em><strong>Note:</strong> This example intentionally uses a less than perfect interpolation to emphasize the difficulties involved in reproducing an analogue signal in a digital system.</em>
</p>
<p>
<em>This gives rise to additional aliasing artifacts close to the <a href="https://en.wikipedia.org/wiki/Nyquist_frequency">Nyquist frequency</a>; you may hear two separate tones moving in opposite directions at certain frequency settings.</em>
</p>
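<!--
Why two tones: the imperfect interpolation does not fully suppress the image at
(sample rate - f), so as the intended tone f sweeps upward its image sweeps
downward; close to the Nyquist frequency both are clearly audible.
-->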
</div>
</div>
</div>
@import url('https://fonts.googleapis.com/icon?family=Material+Icons');
@import url('https://fonts.googleapis.com/css?family=Open+Sans&display=swap');
@import url('https://fonts.googleapis.com/css2?family=Spartan:wght@900&display=swap');
html {
height: 100%;
margin: 0;
padding: 0;
}
body {
font-family: 'Open Sans', sans-serif;
font-size: large;
color: #44546a;
height: 100%;
margin: 0;
padding: 0;
overflow: hidden;
}
a {
font-weight: bold;
/*text-decoration: none;*/
color: #44546a;
}
a:hover {
color: #dd7d3b;
}
.centringContainer {
width: 96%;
max-width: 512px;
height: 100%;
margin: auto;
position: relative;
top: 16px;
}
.main {
height: calc(100% - 320px);
overflow-y: auto;
padding-left: 6px;
padding-right: 6px;
}
.full {
width: 100%;
margin: auto;
/*text-align: center;*/
margin-bottom: 0px;
}
.table {
display: table;
width: 100%;
}
.row {
display: table-row;
height: 24px;
}
.left {
min-width: 96px;
height: 32px;
text-align: right;
display: table-cell;
vertical-align: middle;
/*border: 1px solid black;*/
font-family: 'Spartan', sans-serif;
}
.slidecontainer {
width: 90%;
height: 32px;
display: table-cell;
/*border: 1px solid black;*/
vertical-align: middle;
padding-left: 6px;
}
.togglecontainer {
width: 50%;
height: 32px;
display: table-cell;
/*border: 1px solid black;*/
vertical-align: middle;
padding-left: 6px;
}
.material-icons {
font-family: 'Material Icons';
font-weight: normal;
font-style: normal;
font-size: 32px; /* Preferred icon size */
display: inline-block;
line-height: 1;
text-transform: none;
letter-spacing: normal;
word-wrap: normal;
white-space: nowrap;
direction: ltr;
/* Support for all WebKit browsers. */
-webkit-font-smoothing: antialiased;
/* Support for Safari and Chrome. */
text-rendering: optimizeLegibility;
/* Support for Firefox. */
-moz-osx-font-smoothing: grayscale;
/* Support for IE. */
font-feature-settings: 'liga';
}
.button {
background: #FFFFFF;
cursor: pointer;
/*border: 4px solid #8496b0;
border-radius: 50%;*/
border: 0px;
color: #44546a;
width: 32px;
height: 32px;
}
.button:hover {
color: #8496b0;
}
.slider {
-webkit-appearance: none; /* Override default CSS styles */
appearance: none;
width: 100%; /* Full-width */
height: 16px; /* Specified height */
outline: none; /* Remove outline */
background: #FFFFFF;
}
/* The slider handle (use -webkit- (Chrome, Opera, Safari, Edge) and -moz- (Firefox) to override default look) */
.slider::-webkit-slider-thumb {
-webkit-appearance: none; /* Override default look */
appearance: none;
width: 16px; /* Set a specific slider handle width */
height: 16px; /* Slider handle height */
background: #FFFFFF;
cursor: pointer; /* Cursor on hover */
margin-top: -6px;
border: 2px solid #8496b0;
border-radius: 8px;
}
.slider::-moz-range-thumb {
width: 16px; /* Set a specific slider handle width */
height: 100%; /* Slider handle height */
background: #FFFFFF;
cursor: pointer; /* Cursor on hover */
border-width: 2px;
border-color: #8496b0;
border-radius: 50%;
}
.slider::-webkit-slider-thumb:hover {
background: #EFEFFF;
}
.slider::-moz-range-thumb:hover {
background: #EFEFFF;
}
input[type=range]::-webkit-slider-runnable-track {
width: 100%;
height: 4px;
cursor: pointer;
background: #8496b0;
border-radius: 2px;
}
input[type=range]::-moz-range-track {
width: 100%;
height: 4px;
cursor: pointer;
background: #8496b0;
border-radius: 2px;
}
#responsive-canvas {
width: 100%; /*Necessary to ensure our canvas fits horizontally.*/
}
/* The switch - the box around the slider */
.switch {
position: relative;
display: inline-block;
width: 48px;
height: 24px;
margin-left: 2px;
/*margin-top: 2px;
margin-bottom: 2px;*/
}
/* Hide default HTML checkbox */
.switch input {
opacity: 0;
width: 0;
height: 0;
}
/* The toggle slider */
.toggleSlide {
position: absolute;
cursor: pointer;
top: 0;
left: 0;
right: 0;
bottom: 0;
background-color: #ccc;
-webkit-transition: .4s;
transition: .4s;
border-radius: 24px;
}
.toggleSlide:before {
position: absolute;
content: "";
width: 18px;
height: 18px;
left: 4px;
bottom: 3.5px;
background-color: white;
-webkit-transition: .4s;
transition: .4s;
border-radius: 24px;
}
input:checked + .toggleSlide {
background-color: #44546a;
}
input:checked + #orangeBack {
background-color: #dd7d3b;
}
/*input:focus + .toggleSlide {
box-shadow: 0 0 1px #8496b0;
}*/
input:checked + .toggleSlide:before {
-webkit-transform: translateX(22px);
-ms-transform: translateX(22px);
transform: translateX(22px);
}
var canvas;
var freqSlider;
var samplesCheckbox;
var aliasedCheckbox;
var frequency = 2.0;
//True if audio is running.
var audioRunning = false;
var aliasingGenNode;
var audioContext;
window.addEventListener('load', (event) => {
//Get canvas for setting internal resolution.
canvas = document.getElementById('responsive-canvas');
//Get our on-screen controls.
freqSlider = document.getElementById("FreqSlider");
samplesCheckbox = document.getElementById("SamplesToggle");
aliasedCheckbox = document.getElementById("AliasedToggle");
freqSlider.oninput = freqSliderChange;
samplesCheckbox.onchange = samplesCheckboxChange;
aliasedCheckbox.onchange = aliasedCheckboxChange;
window.onresize = resize;
resize();
draw();
});
//Helper used as a template-literal tag: `code` receives the array of string parts,
//which the Blob constructor accepts directly, giving us an object URL for the
//worklet source defined below.
function getURLFromInlineCode(code) {
const codeBlob = new Blob(code, {type: 'application/javascript'});
return URL.createObjectURL(codeBlob);
}
//Our custom audio processor, encapsulated in a URL.
const aliasingGenURL = getURLFromInlineCode`
class AliasingGenerator extends AudioWorkletProcessor {
static get parameterDescriptors () {
return [{
name: 'frequency',
defaultValue: 1,
minValue: 1,
maxValue: 8,
automationRate: 'a-rate'
},
{
name: 'amplitude',
defaultValue: 1,
minValue: 0,
maxValue: 1,
automationRate: 'a-rate'
}]
}
constructor() {
super();
this.sinIndex = 0.0;
/*this.frequency = 1.0;
this.amplitude = 0.0;*/
this.sinCounter = 0.0;
this.sinVal = [0, 0, 0, 0];
//Used to fake the nyquist cancellation.
this.ampVal = [0, 0, 0, 0];
//Our artificially-lowered samplerate.
this.fakeSamplerate = 1660.0;
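//Each output frame advances sinCounter by fakeSamplerate/sampleRate, and a new
//sine sample is only computed when it crosses 1, so the generator behaves like a
//1660 Hz converter running inside the real-rate context (see process() below).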
}
//Generate our audio.
process(inputs, outputs, parameters) {
for(let i=0;i<outputs[0].length;++i) {
const buffer = outputs[0][i];
if(i == 0) {
for(let j=0;j<buffer.length;++j) {
this.sinCounter += this.fakeSamplerate/sampleRate;
//a-rate parameters supply one value per frame (index j) when they are not constant.
var freq = (parameters['frequency'].length > 1) ? parameters['frequency'][j] : parameters['frequency'][0];
var amp = (parameters['amplitude'].length > 1) ? parameters['amplitude'][j] : parameters['amplitude'][0];
if(this.sinCounter >= 1.0)
{
this.sinVal[0] = this.sinVal[1];
this.sinVal[1] = this.sinVal[2];
this.sinVal[2] = this.sinVal[3];
this.sinVal[3] = Math.sin(this.sinIndex);
this.sinIndex += ((freq * 220.0)/this.fakeSamplerate) * 2.0 * Math.PI;
this.sinIndex %= (2.0 * Math.PI);
this.ampVal[0] = this.ampVal[1];
this.ampVal[1] = this.ampVal[2];
this.ampVal[2] = this.ampVal[3];
if((freq > 3.9) && (freq < 4.0))
this.ampVal[3] = 1.0 - ((freq - 3.9)/0.1);
else if((freq >= 4.0) && (freq < 4.1))
this.ampVal[3] = (freq - 4.0)/0.1;
else
this.ampVal[3] = 1.0;
this.sinCounter -= 1.0;
}
//B-spline interpolation to get our aliased sine wave.
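//c0..c3 are the polynomial coefficients of a cubic uniform B-spline through the four
//surrounding samples; it is evaluated below with Horner's rule at the fractional
//position sinCounter in [0, 1): value = ((c3*t + c2)*t + c1)*t + c0.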
var ym1py1 = this.sinVal[0] + this.sinVal[2];
var c0 = (1.0/6.0) * ym1py1 + (2.0/3.0) * this.sinVal[1];
var c1 = (1.0/2.0) * (this.sinVal[2]-this.sinVal[0]);
var c2 = (1.0/2.0) * ym1py1 - this.sinVal[1];
var c3 = (1.0/2.0) * (this.sinVal[1]-this.sinVal[2]) + (1.0/6.0) * (this.sinVal[3]-this.sinVal[0]);
buffer[j] = ((c3*this.sinCounter+c2)*this.sinCounter+c1)*this.sinCounter+c0;
buffer[j] *= amp;
//B-spline interpolation of our amplitude hitting zero at nyquist.
ym1py1 = this.ampVal[0] + this.ampVal[2];
c0 = (1.0/6.0) * ym1py1 + (2.0/3.0) * this.ampVal[1];
c1 = (1.0/2.0) * (this.ampVal[2]-this.ampVal[0]);
c2 = (1.0/2.0) * ym1py1 - this.ampVal[1];
c3 = (1.0/2.0) * (this.ampVal[1]-this.ampVal[2]) + (1.0/6.0) * (this.ampVal[3]-this.ampVal[0]);
buffer[j] *= ((c3*this.sinCounter+c2)*this.sinCounter+c1)*this.sinCounter+c0;
}
}
else {
//We're mono, but if we find ourselves working in stereo, just copy channel 0.
buffer.set(outputs[0][0]);
}
}
return true;
}
}
registerProcessor('AliasingGen', AliasingGenerator);
`
//Node for our audio processor.
class AliasingGenNode extends AudioWorkletNode {
constructor(context) {
super(context, 'AliasingGen');
}
}
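//Illustrative only (not used by the demo): a minimal sketch of where a tone of
//frequency f lands after sampling at `rate` with no reconstruction filtering.
//The helper name and signature are our own, not part of the Web Audio API.
function aliasedFrequency(f, rate) {
  //Fold f back into the 0..Nyquist range.
  return Math.abs(f - rate * Math.round(f / rate));
}
//e.g. aliasedFrequency(1760, 1660) === 100, matching the worked example in the HTML.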
//Used to resize our canvas when the window's resized.
function resize() {
canvas.width = window.innerWidth * window.devicePixelRatio;
//console.log("window.innerWidth: ", window.innerWidth * window.devicePixelRatio);
//Set canvas height as a proportion of width.
/*var height = canvas.width * 0.5;
if(height > 320)
height = 320;
canvas.height = height;*/
canvas.height = 320;
draw();
}
//Used to redraw the canvas when necessary.
function draw() {
//Start drawing.
const canvasContext = canvas.getContext('2d');
const centreY = canvas.height/2;
const sinHeight = centreY - 8;
/*canvasContext.fillStyle = 'white';
canvasContext.fillRect(0, 0, canvas.width, canvas.height);*/
if(samplesCheckbox.checked)
canvasContext.strokeStyle = 'rgb(132, 150, 176)';
else
canvasContext.strokeStyle = 'rgb(68, 84, 106)';
canvasContext.lineWidth = 4;
var numPeriods = frequency;
var periodLength = (canvas.width/numPeriods);
var halfLength = (periodLength * 0.5);
canvasContext.translate(-(0.5 * halfLength), 0);
canvasContext.beginPath();
canvasContext.moveTo(0, centreY + sinHeight);
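//Each half-period of the "analogue" sine is approximated with a cubic Bezier curve;
//the 0.3634/0.6366 control-point offsets give a visually close fit to a true sine.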
for(let i=0;i<(numPeriods + 1);++i) {
var x = (i * periodLength);
canvasContext.bezierCurveTo(x + (halfLength * 0.3634),
centreY + sinHeight,
x + (halfLength * 0.6366),
centreY - sinHeight,
x + halfLength,
centreY - sinHeight);
x += halfLength;
canvasContext.bezierCurveTo(x + (halfLength * 0.3634),
centreY - sinHeight,
x + (halfLength * 0.6366),
centreY + sinHeight,
x + halfLength,
centreY + sinHeight);
}
canvasContext.stroke();
canvasContext.translate((0.5 * halfLength), 0);
//Calculate samples.
var sampleDist = canvas.width/16.0;
var samples = new Array(19);
for(let i=0;i<19;++i) {
//Calculate y position.
samples[i] = centreY;
samples[i] -= Math.sin((i/16) * frequency * 2 * Math.PI) * sinHeight;
}
//Draw aliased line.
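//The orange line is reconstructed purely from the sample values above: we sweep ~1024
//x positions across the canvas and interpolate with thirdInterp() over a sliding
//window of four samples. This is the deliberately imperfect reconstruction mentioned
//in the page text.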
if(aliasedCheckbox.checked) {
let points = [[-sampleDist, centreY - Math.sin((-1/16) * frequency * 2 * Math.PI) * sinHeight], [0, samples[0]], [sampleDist, samples[1]], [sampleDist * 2, samples[2]]];
canvasContext.strokeStyle = 'rgb(221, 125, 59)';
canvasContext.beginPath();
canvasContext.moveTo(0, centreY);
var x = 0;
var xInc = canvas.width/1024;
var lastX = 0;
for(let i=1;i<1024;++i) {
var samplesX = (x/canvas.width) * 16;
var fraction = samplesX - Math.floor(samplesX);
samplesX = Math.floor(samplesX);
if(samplesX != lastX) {
//x-axis (kept consistent with the window position; only the y values are interpolated).
points[0][0] = sampleDist * (samplesX-1);
points[1][0] = sampleDist * samplesX;
points[2][0] = sampleDist * (samplesX+1);
points[3][0] = sampleDist * (samplesX+2);
//y-axis.
points[0][1] = points[1][1];
points[1][1] = points[2][1];
points[2][1] = points[3][1];
points[3][1] = samples[samplesX + 2];
lastX = samplesX;
}
var temp = thirdInterp(fraction, points[0][1], points[1][1], points[2][1], points[3][1]);
canvasContext.lineTo(x, temp);
x += xInc;
}
canvasContext.stroke();
}
//Draw samples.
if(samplesCheckbox.checked) {
var x = 0;
for(let i=0;i<17;++i) {
//Set draw colours.
canvasContext.fillStyle = 'rgb(255, 255, 255)';
canvasContext.strokeStyle = 'rgb(68, 84, 106)';
//Draw vertical line.
canvasContext.beginPath();
canvasContext.moveTo(x, centreY);
canvasContext.lineTo(x, samples[i]);
canvasContext.stroke();
//Draw sample circle.
canvasContext.beginPath();
canvasContext.arc(x, samples[i], 5, 0, 2 * Math.PI, false);
canvasContext.fill();
canvasContext.stroke();
x += sampleDist;
}
}
}
function freqSliderChange() {
frequency = parseFloat(freqSlider.value);
if((frequency > 8) && (frequency < 8.05)) {
frequency = 8;
freqSlider.value = frequency;
}
if(audioRunning) {
aliasingGenNode.parameters.get('frequency').linearRampToValueAtTime(frequency * 0.5, audioContext.currentTime + 0.05);
}
const canvasContext = canvas.getContext('2d');
canvasContext.clearRect(0, 0, canvas.width, canvas.height);
draw();
}
function samplesCheckboxChange() {
const canvasContext = canvas.getContext('2d');
canvasContext.clearRect(0, 0, canvas.width, canvas.height);
draw();
}
function aliasedCheckboxChange() {
const canvasContext = canvas.getContext('2d');
canvasContext.clearRect(0, 0, canvas.width, canvas.height);
draw();
}
//Used to toggle our audio on and off.
//Function must be async in order to use await on audioWorklet.addModule().
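//Note: browsers generally require a user gesture before audio can start; this function
//is called from the button's onclick handler, which satisfies that requirement.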
async function audioToggle() {
if(!audioRunning) {
//Create web audio api context.
audioContext = new (window.AudioContext || window.webkitAudioContext)();
//Add our custom audio processor class.
//addModule() returns a promise, so we have to await until it's fulfilled.
await audioContext.audioWorklet.addModule(aliasingGenURL);
//Create an instance of our custom audio processor class.
aliasingGenNode = new AliasingGenNode(audioContext);
aliasingGenNode.parameters.get('amplitude').setValueAtTime(0.0, audioContext.currentTime);
aliasingGenNode.parameters.get('amplitude').linearRampToValueAtTime(1.0, audioContext.currentTime + 0.25);
//Connect our nodes to the audio context.
aliasingGenNode.connect(audioContext.destination);
audioRunning = true;
//Update audio button.
document.getElementById("AudioToggle").innerHTML = "<span class='material-icons'>volume_up</span>";
}
else {
//Stops the audio, releases any audio resources used.
audioContext.close();
audioRunning = false;
//Update audio button.
document.getElementById("AudioToggle").innerHTML = "<span class='material-icons'>volume_off</span>";
}
}
//Interpolation function used in drawing the aliased line.
//x is the fractional position in [0, 1) between samples L0 and H0; L1 and H1 are the
//neighbouring samples either side of that interval.
function thirdInterp(x, L1, L0, H0, H1) {
return L0 + 0.5 *
x*(H0-L1 +
x*(H0 + L0*(-2) + L1 +
x*( (H0 - L0)*9 + (L1 - H1)*3 +
x*((L0 - H0)*15 + (H1 - L1)*5 +
x*((H0 - L0)*6 + (L1 - H1)*2 )))));
}