Blog

  • PYNQ-CNN-ATTEMPT

    PYNQ-CNN-ATTEMPT

    These are some attempts I made during my undergraduate graduation project.

    The hardware platform I use is PYNQ-Z2.
    The PS part is an Arm CPU running Ubuntu 16.04 LTS, which supports Python.
    The PL part is the Zynq XC7Z020 FPGA.

    The version of Vivado and Vivado HLS is 2018.2.

    If you have any problems, please contact me.

    Digilent Vivado IP Library

    This is the open source IP library provided by Digilent for video processing. I mainly use its rgb2dvi to implement my HDMI video output module.

    HDMI VDMA Test

    This is the Vivado project of the HDMI video output test I built. The video data is output from the DDR memory through VDMA. Please see the Ultrasound Image Classification section for details.

    Mean Single Convolution

    This is the project I built to try out the PYNQ development flow before implementing a CNN; it implements hardware acceleration for a single convolution operation.
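
    For orientation, here is a minimal sketch of the kind of kernel such a project offloads to the PL. This is not the repository's actual code; the array sizes and the HLS pragma are illustrative only.

    #define IN_H 32
    #define IN_W 32
    #define K    3

    // Single-channel 2D convolution, the basic operation accelerated in hardware.
    void conv2d(const float in[IN_H][IN_W], const float kernel[K][K],
                float out[IN_H - K + 1][IN_W - K + 1]) {
        for (int r = 0; r <= IN_H - K; r++) {
            for (int c = 0; c <= IN_W - K; c++) {
                #pragma HLS PIPELINE II=1
                float acc = 0.0f;
                for (int i = 0; i < K; i++)
                    for (int j = 0; j < K; j++)
                        acc += in[r + i][c + j] * kernel[i][j];
                out[r][c] = acc;
            }
        }
    }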

    MNIST CNN

    This is the project that implements classification of the MNIST dataset.

    Ultrasound Image Classification CNN

    This project performs automatic classification of ultrasound images and is currently my latest work. It can read ultrasound image data from the SD card for classification, then synthesize the resulting image and output it through the HDMI port. Due to privacy concerns, I have only uploaded a small number of images for testing.

    Visit original content creator repository
    https://github.com/ZhaoqxCN/PYNQ-CNN-ATTEMPT

  • ESP32_ePaper_Frame

    ESP32_ePaper_Frame

    Smart e-paper frame controlled over Wi-Fi, using an ESP32 and a 7.5-inch Waveshare E-Ink display.

    Demo Image 1

    Requirements

    Libraries

    ESP32_ePaper_Frame uses the ESPAsyncWebServer library; AsyncTCP is a dependency of ESPAsyncWebServer.

    Clone/download both libraries and save them to your Arduino libraries folder (usually located at %USERPROFILE%\Documents\Arduino\libraries).
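
    For context, here is a minimal sketch of how a photo-upload endpoint can be wired up with ESPAsyncWebServer. It is illustrative only and not the project's actual server code; the inline HTML form and the drawChunkOnDisplay() call are placeholders.

    #include <WiFi.h>
    #include <ESPAsyncWebServer.h>

    AsyncWebServer server(80);  // HTTP server on port 80

    void setup() {
        Serial.begin(115200);
        WiFi.begin("YOUR_WIFI_SSID", "YOUR_WIFI_PASSWORD");
        while (WiFi.status() != WL_CONNECTED) delay(500);
        Serial.println(WiFi.localIP());

        // Serve a bare-bones upload form
        server.on("/", HTTP_GET, [](AsyncWebServerRequest *request) {
            request->send(200, "text/html",
                          "<form method='POST' action='/upload' enctype='multipart/form-data'>"
                          "<input type='file' name='photo'><input type='submit'></form>");
        });

        // Receive the uploaded image in chunks
        server.on("/upload", HTTP_POST,
            [](AsyncWebServerRequest *request) { request->send(200, "text/plain", "OK"); },
            [](AsyncWebServerRequest *request, String filename, size_t index,
               uint8_t *data, size_t len, bool final) {
                // drawChunkOnDisplay(data, len, final);  // placeholder: push pixels to the e-paper driver
            });

        server.begin();
    }

    void loop() {}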

    Hardware

    • ESP32 microcontroller
    • 7.5inch E-Ink display with driver board for ESP32 (Waveshare).

    Installation

    1. Open the Smart_ePaper_Frame.ino file with the Arduino IDE.
    2. Upload the data folder using ESP32 Filesystem Uploader (Tutorial).
    3. Upload the code to your ESP32 board.

    Usage

    1. Configure your Wi-Fi credentials in the credentials.h file:

      WifiCredentials wifi_credentials = {
          "YOUR_WIFI_SSID",
          "YOUR_WIFI_PASSWORD"
      };
    2. Configure the SPI pins connected to the display driver board in the display.h file:

      #define PIN_SPI_SCK     18
      #define PIN_SPI_DIN     23
      #define PIN_SPI_CS      5
      #define PIN_SPI_BUSY    4
      #define PIN_SPI_RST     16
      #define PIN_SPI_DC      17
    3. After the program starts, the IP address of the board will be shown in the Serial Monitor:

      Connecting to WiFi..
      192.168.1.32
      
    4. Browse to the given IP in your browser and upload a photo.

    More Images:

    Demo Image 2

    Dithering

    Demo Image 3

    Visit original content creator repository https://github.com/dani3lwinter/ESP32_ePaper_Frame
  • windfall-website

    This is a Next.js project bootstrapped with create-next-app.

    Getting Started

    First, run the development server:

    npm run dev
    # or
    yarn dev
    # or
    pnpm dev

    Open http://localhost:3000 with your browser to see the result.

    You can start editing the page by modifying pages/index.tsx. The page auto-updates as you edit the file.

    API routes can be accessed on http://localhost:3000/api/hello. This endpoint can be edited in pages/api/hello.ts.

    The pages/api directory is mapped to /api/*. Files in this directory are treated as API routes instead of React pages.
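
    For reference, here is a minimal version of such a route, mirroring the handler that create-next-app scaffolds (the actual file contents may differ slightly):

    // pages/api/hello.ts
    import type { NextApiRequest, NextApiResponse } from "next";

    type Data = { name: string };

    export default function handler(req: NextApiRequest, res: NextApiResponse<Data>) {
      // Files under pages/api are served at /api/<name>
      res.status(200).json({ name: "John Doe" });
    }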

    This project uses next/font to automatically optimize and load Inter, a custom Google Font.

    Learn More

    To learn more about Next.js, take a look at the following resources:

    You can check out the Next.js GitHub repository – your feedback and contributions are welcome!

    Deploy on Vercel

    The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

    Check out our Next.js deployment documentation for more details.

    Visit original content creator repository
    https://github.com/azurespheredev/windfall-website

  • next-graphql-apollo-contentful

    In progress: Next.js project using Typescript, Styled Components, Contentful’s GraphQL API and Apollo.

    Static Generation

    The portfolio and its contents are statically pre-rendered and revalidated every 24 hours. To do so, the following steps are followed:

    Respective files:

    1. Project Page
    2. GraphQL

    First step

    Within getStaticPaths, fetch all project identifiers to tell Next.js what params to expect/pre-render:

    export const getStaticPaths = async () => {
        const slugs = await getProjectSlugs();
    
        const paths = slugs.map(item => ({
            params: { slug: item.slug },
        }));
    
        // Pre-render only fetched paths at build time.
        // Server-side render on demand if the path doesn't exist.
        return { paths, fallback: "blocking" };
    };

    // query.ts
    export const PROJECT_SLUGS = gql`
        query GetProjectSlugs {
            projectCollection {
                items {
                    slug
                }
            }
        }
    `;
    
    // slugs.ts
    export const getProjectSlugs = async (): Promise<Projects.ProjectSlug[]> => {
        const { data } = await client.query<ProjectSlugsCollection>({ query: PROJECT_SLUGS });
        return extractProjectsSlugsCollection(data);
    };

    Second step

    Within getStaticProps, use params to fetch the respective project at build time:

    export const getStaticProps: GetStaticProps = async ({ params }) => {
        const slug = params?.slug;
    
        if (!slug || "string" !== typeof slug) {
            return {
                notFound: true,
            };
        }
    
        const project = await getProjectBySlug(slug);
    
        if (!project) {
            return {
                notFound: true,
            };
        }
    
        return {
            props: { project },
            revalidate: 60 * 60 * 24,
        };
    };
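
    For completeness, getProjectBySlug (referenced above but not shown) could look roughly like this. This is a sketch against Contentful's GraphQL API, reusing the gql and client imports from the snippets above; the actual query fields in the repository will differ:

    // query.ts (sketch)
    export const PROJECT_BY_SLUG = gql`
        query GetProjectBySlug($slug: String!) {
            projectCollection(where: { slug: $slug }, limit: 1) {
                items {
                    slug
                    title
                }
            }
        }
    `;

    // project.ts (sketch)
    export const getProjectBySlug = async (slug: string) => {
        const { data } = await client.query({ query: PROJECT_BY_SLUG, variables: { slug } });
        return data?.projectCollection?.items[0] ?? null;
    };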

    Getting Started

    Run the development server:

    npm run dev
    # or
    yarn dev

    Open http://localhost:3000 with your browser to see the result.

    Visit original content creator repository
    https://github.com/timfuhrmann/next-graphql-apollo-contentful

  • i18n.cr

    i18n – Internationalization Library for Crystal

    Installation

    Add this to your application’s shard.yml:

    dependencies:
      i18n:
        github: crimson-knight/i18n.cr

    Usage

    I18n.translate(
      "some.dot.separated.path",  # key : String
      {attr_to_interpolate: "a"}, # options : Hash | NamedTuple? = nil
      "pt",                       # force_locale : String = nil
      2,                          # count : Numeric? = nil
      "default translation",      # default : String? = nil
      nil                         # iter : Int? = nil
    )
    
    I18n.localize(
      Time.utc_now, # object : _
      "pt",         # force_locale : String = I18n.config.locale
      :time,        # scope : Symbol? = :number
      "long"        # format : String? = nil
    )

    Arguments interpolation

    Translations may include argument interpolation. To do this, use a regular Crystal named interpolation placeholder and pass a hash or named tuple as the options argument:

    message:
      new: "New message: %{text}"
    # New message: hello
    I18n.translate("message.new", {text: "hello"})
    I18n.translate("message.new", {:text => "hello"})
    I18n.translate("message.new", {"text" => "hello"})

    Any extra key-value pairs will be ignored, and a missing one will not cause an exception:

    I18n.translate("message.new", {message: "hello"}) # New message: %{text}

    Configuration

    require "i18n"
    
    I18n.load_path += ["spec/locales"]
    I18n.init # This will load locales from all specified locations
    
    I18n.default_locale = "pt" # default can be set after loading translations

    There is a handler for Kemalyst that brings I18n configuration.

    Available backends

    I18n::Backend::Yaml

    A simple backend that reads translations from YAML files and stores them in an in-memory hash. Also supports JSON files and translations embedding.

    I18n::Backend::Chain

    Backend that chains multiple other backends and checks each of them when a translation needs to be looked up. This is useful when you want to use standard translations with a custom backend to store custom application translations in a database or other backends.

    To use the Chain backend instantiate it and set it to the I18n module. You can add chained backends through the initializer:

    require "i18n/backend/chain"
    
    other_backend = I18n::Backend::Yaml.new # your other backend
    I18n.backend.load("config/locales/en.yml")
    other_backend.load("config/locales/pl.yml")
    I18n.backend = I18n::Backend::Chain.new([I18n.backend, other_backend] of I18n::Backend::Base)
    
    # or if it is ok to pass files to each backend
    
    I18n.backend = I18n::Backend::Chain.new([I18n.backend, I18n::Backend::Yaml.new] of I18n::Backend::Base)
    I18n.load_path = ["config/locales/{en,pl}.yml"]
    I18n.load

    I18n::Backend::Fallback

    I18n locale fallbacks are useful when you want your application to use translations from other locales when translations for the current locale are missing. E.g. you might want to use en translations when translations in your application’s main locale de are missing.

    To enable locale fallbacks, you can instantiate the fallback backend, giving it your backend as an argument:

    require "i18n/backend/fallback"
    
    I18n.load_path = ["config/locales"]
    I18n.init
    I18n.backend = I18n::Backend::Fallback.new(I18n.backend, {"en-US" => "en", "en-UK" => "en"})

    Note on YAML Backend

    Putting translations for all parts of your application in one file per locale could be hard to manage. You can store these files in a hierarchy which makes sense to you.

    For example, your config/locales directory could look like this:

    locales
    |--defaults
    |----en.yml
    |----pt.yml
    |--models
    |----en.yml
    |----pt.yml
    |--views
    |----users
    |------en.yml
    |------pt.yml

    This way you can separate model-related translations from the view ones. To require all described subfolders at once, use I18n.load_path += ["locales/**/"]

    Any .json file located in the file hierarchy specified for load_path is also read and parsed.

    Date/Time Formats

    To localize a time (or date), pass a Time object to I18n.localize. To pick a specific format, pass the format argument:

    I18n.localize(Time.local, scope: :date, format: :long)

    By default Time will be localized with :time scope.
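
    For example (a small sketch restating these defaults; the date formats come from the __formats__ file shown below):

    I18n.localize(Time.local)                              # Time objects use the :time scope
    I18n.localize(Time.local, scope: :date)                # :date scope, "default" format
    I18n.localize(Time.local, scope: :date, format: :long) # :date scope, "long" format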

    To specify formats and all needed localization information (like day or month names), fill your file in the following way:

    NOTE: According to ISO 8601, Monday is the first day of the week

    __formats__:
      date:
        formats:
          default: '%Y-%m-%d' # is used by default
          long: '%A, %d de %B %Y'
        month_names: # long month names
          - Janeiro
          - Fevereiro
          - Março
          - Abril
          - Maio
          - Junho
          - Julho
          - Agosto
          - Setembro
          - Outubro
          - Novembro
          - Dezembro
        abbr_month_names: # month abbreviations
          - Jan
          - Fev
          # ...
        day_names: # full day names
          - Segunda
          # ...
        abbr_day_names: # short day names
          - Seg
          # ...

    Format accepts any Crystal Time::Format directives. The following directives will also be automatically localized:

    Directive Description Key
    %a short day name date.abbr_day_names
    %A day name date.day_names
    %b short month name date.abbr_month_names
    %B month name date.month_names
    %p am-pm (lowercase) time.am/time.pm
    %P AM-PM (uppercase) time.am/time.pm

    Pluralization

    In many languages — including English — there are only two forms, a singular and a plural, for a given string, e.g. “1 message” and “2 messages”. Other languages (Arabic, Japanese, Russian and many more) have different grammars that have additional or fewer plural forms.

    The count interpolation variable has a special role: it is both interpolated into the translation and used to pick a pluralization from the translations according to the pluralization rules defined by CLDR:

    message:
      one: "%{count} message"
      other: "%{count} messages"
    I18n.translate("message", count: 1) # 1 message
    I18n.translate("message", count: 2) # 2 messages
    I18n.translate("message", count: 0) # 0 messages

    count should be passed as an argument – not inside options. Otherwise the regular translation lookup will be applied.
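
    For example, using the message translations above (a sketch; the second call only illustrates that no plural form is selected):

    I18n.translate("message", count: 2)   # pluralized => "2 messages"
    I18n.translate("message", {count: 2}) # count is treated as a plain interpolation option,
                                          # so the regular (non-plural) lookup is applied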

    I18n defines default CLDR rules for many locales (see src/i18n/config/plural_rules), however they can be overwritten:

    I18n.plural_rules["ru"] = ->(n : Int32) {
      if n == 0
        :zero
      elsif ((n % 10) == 1) && ((n % 100 != 11))
        # 1, 21, 31, 41, 51, 61...
        :one
      elsif ([2, 3, 4].includes?(n % 10) && ![12, 13, 14].includes?(n % 100))
        # 2-4, 22-24, 32-34...
        :few
      elsif ((n % 10) == 0 || ![5, 6, 7, 8, 9].includes?(n % 10) || ![11, 12, 13, 14].includes?(n % 100))
        # 0, 5-20, 25-30, 35-40...
        :many
      else
        :other
      end
    }
    kid:
      zero: 'нет детей'
      one: '%{count} ребенок'
      few: '%{count} ребенка'
      many: '%{count} детей'
      other: '%{count} детей'
    I18n.locale = "ru"
    
    I18n.translate("kid", count: 0) # нет детей
    I18n.translate("kid", count: 1) # 1 ребенок
    I18n.translate("kid", count: 2) # 2 ребенка
    I18n.translate("kid", count: 6) # 6 детей

    Iteration

    To store several alternative objects under one localization key, they can simply be listed in the file and later retrieved using the iter argument:

    NOTE: The first index is 0

    __formats__:
      date:
        day_names: [Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday]
    I18n.translate("__formats__.date.day_names", iter: 2)  # >>> "Wednesday"

    Embedding translations inside your binary

    You can embed translations inside your binary by using the following macro call:

    I18n::Backend::Yaml.embed(["some/locale/directory", "some/other/locale/directory"])

    Changes from 0.4.0 to 0.4.1

    • Supports Crystal versions >= 0.35
    • Added Backend::Yaml#exists? to check whether a given translation key exists
    • Added I18n.exists?
    • Added docs to the I18n module's public methods (most of the wording was taken from the Ruby I18n repo)
    • Fixed the iter argument in translate to properly return the correct index

    Example

      # Array we are looking into: ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"]
      I18n.translate("__formats__.date.day_names", iter: 2) # Returns "Wednesday"

    Breaking changes from 0.3 to 0.4

    • Pluralization rules now fully follow the CLDR convention. Specifically, en pluralization no longer returns zero.

    Breaking changes from 0.2 to 0.3

    • The first day of the week is now Monday, according to ISO 8601.
    • The nil value in month_names and abbr_month_names was removed.

    Contributing

    1. Fork it ( https://github.com/crimson-knight/i18n/fork )
    2. Create your feature branch (git checkout -b my-new-feature)
    3. Commit your changes (git commit -am 'Add some feature')
    4. Push to the branch (git push origin my-new-feature)
    5. Create a new Pull Request

    Contributors

    Inspiration taken from:

    Special thank you to TechMagister for being the original owner and creator of this shard.

    Visit original content creator repository https://github.com/crimson-knight/i18n.cr
  • loopback-component-meta

    loopback-component-meta

    Component for LoopBack that adds a Meta model that can be used to retrieve metadata about the model definitions.

    Installation

    Install the module

    $ npm install --save loopback-component-meta
    

    Configure the module in server/component-config.json

    {
      "loopback-component-meta": {
        "enableRest": true,
        "filter": [
          "ACL",
          "AccessToken",
          "RoleMapping",
          "Role",
          "User"
        ],
        "acls": [{
          "accessType": "*",
          "principalType": "ROLE",
          "principalId": "$unauthenticated",
          "permission": "ALLOW"
        }]
      }
    }
    

    Usage

    After installation you should be able to retrieve data about your models through the Meta endpoint on your API:

    Retrieve all models:

    http://0.0.0.0:3000/api/Meta

    [{
    	id: "Category",
    	name: "Category",
    	properties: {
    		name: {
    			type: "String",
    			required: true
    		}
    	},
    	acls: [],
    	base: "BaseModel",
    	idInjection: true,
    	methods: {},
    	mixins: {
    		ModifiedTimestamp: {}
    	},
    	relations: {
    		products: {
    			type: "hasMany",
    			model: "Product",
    			foreignKey: ""
    		}
    	},
    	strict: false,
    	validations: []
    }, {
    	id: "Product",
    	name: "Product",
    	properties: {
    		name: {
    			type: "String",
    			required: true
    		}
    	},
    	acls: [],
    	base: "BaseModel",
    	idInjection: true,
    	methods: {},
    	mixins: {
    		ModifiedTimestamp: {}
    	},
    	relations: {
    		category: {
    			type: "belongsTo",
    			model: "Category",
    			foreignKey: ""
    		}
    	},
    	strict: false,
    	validations: []
    }]
    

    Retrieve one model:

    http://0.0.0.0:3000/api/Meta/Category

    {
    	id: "Category",
    	name: "Category",
    	properties: {
    		name: {
    			type: "String",
    			required: true
    		}
    	},
    	acls: [],
    	base: "BaseModel",
    	idInjection: true,
    	methods: {},
    	mixins: {
    		ModifiedTimestamp: {}
    	},
    	relations: {
    		products: {
    			type: "hasMany",
    			model: "Product",
    			foreignKey: ""
    		}
    	},
    	strict: false,
    	validations: []
    }
    
    Visit original content creator repository https://github.com/fullcube/loopback-component-meta
  • yieldscan-frontend

    YieldScan

    Maximizing yield on staking. Starting with Kusama.

    Table of contents

    Currently supported networks

    Description

    We aim to solve the problems of information asymmetry in identifying and optimizing returns on staking, reducing time and capital costs for stakers to make staking decisions.

    This project is funded and supported by the Web3 Foundation – under Wave 6.

    Usage

    Pre-requisites

    • PolkadotJS browser extension
    • At least one account on Kusama with enough balance to pay for transaction fees and bond funds.

    Currently, the app can be used on https://yieldscan.onrender.com/, but the domain is likely to change and shall be updated here.

    ⚠️ IMPORTANT: Please note that this project is in early beta stage and bugs and issues are to be expected.

    Borrowing from Kusama’s tagline – “Expect Chaos”

    Usage Instructions

    1. Go to YieldScan. You will be greeted with the following page: YieldScan Landing Page

    2. Enter your budget and click “Calculate Returns”. You will be redirected to the returns calculator, which will show you your potential earnings and allow you to tweak your staking preferences to get varied results: Return Calculator - Wallet Not Connected

    3. Once you’re satisfied with your preferences and inputs, simply connect your PolkadotJS wallet by clicking either the “Connect wallet” button on the header or the “Connect wallet to stake” button in the returns card. This will prompt you to connect your wallet: Wallet Connection Popup

    Click on “Connect my wallet”. You will be prompted by PolkadotJS to authorize YieldScan – this authorization is needed for us to prompt you to sign transactions – this keeps your keys safe and allows you to be in control of whether or not you want any transaction to be sent to the chain.

    4. Once you’ve authorized the app, simply select an account for staking and you’ll be ready to proceed forward from the return calculator.

    5. Simply click the “Stake” button on the returns calculator and you’ll be redirected to the payment confirmation page: Payment Confirmation

    6. Once you’re satisfied with the selected preferences, click on confirm and read the terms of service – please make sure you understand the risks before you proceed any further. Once you understand the risks and agree to the terms, you’ll be taken to the reward destination selection step. If you have decided to compound your rewards on the calculator, you can simply proceed forward. If you decided not to compound your rewards and plan to use a separate controller for staking, then select the reward destination of your choice (i.e. stash or controller) where you would like your rewards to be awarded and then proceed. Reward Destination

    7. Finally, you’ll be asked to confirm the staking amount and the account(s) being used for nomination. You can edit the controller here if you like or use the default selection – i.e. same account for stash as well as controller. Account confirmation

    8. Click on “Transact and Stake” and you’ll be prompted by the PolkadotJS extension to sign the transaction: Transaction Signing

    9. Congratulations! You’re now a nominator:

    Success

    10. On clicking proceed, you’ll be redirected to your staking dashboard where you can see your expected returns, manage your nominations, unbond or bond more funds and change the payment destination:

    Dashboard

    Development

    Getting Started

    • Clone the repository:

       git clone https://github.com/buidl-labs/yieldscan-frontend
    • Install the dependencies:

       npm install
       # or
       yarn
    • Add environment variables in .env.local

       # Main API endpoint
       NEXT_PUBLIC_API_BASE_URL=<base-url-of-deployed/local-api>
      
       # Tracking
       NEXT_PUBLIC_AMPLITUDE_API_TOKEN=<your-amplitude-api-token> # For development you can pass a string like "none" - to prevent unnecessary data from being logged
       NEXT_PUBLIC_METOMIC_PROJECT_ID=<your-metomic-project-id>
      
       # Sentry (optional)
       NEXT_PUBLIC_SENTRY_DSN=<your-sentry-dsn>
      
       # Only required to upload sourcemaps
      
       SENTRY_ORG=<your-sentry-org>
       SENTRY_PROJECT=<your-sentry-project>
       SENTRY_AUTH_TOKEN=<your-sentry-auth-token>

      Note: You can check out the backend codebase here.

      Useful resources:

    • Run the development server:

       npm run dev
       # or
       yarn dev

      Open http://localhost:3000 with your browser to see the result.

      You can start editing any page by modifying pages/<page>.js. The page auto-updates as you edit the file.

    • Creating a new user flow?

      • Create a new page in pages/<page>.js
      • Create a layout if needed in components/common/layouts else use components/common/layouts/base.js
      • Create the page’s root component in components/<page>/index.js (a minimal sketch follows this list)
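
    Putting those three pieces together, a minimal new page might look like this (the page name, layout import, and component paths here are hypothetical, not files from the repository):

    // pages/example.js (hypothetical)
    import BaseLayout from "../components/common/layouts/base";
    import ExampleFlow from "../components/example";

    export default function ExamplePage() {
        return (
            <BaseLayout>
                <ExampleFlow />
            </BaseLayout>
        );
    }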

    Git commit

    Run npm run git:commit for committing your code and follow the process.

    Learn More

    To learn more about Next.js, take a look at the following resources:

    Gratitude

    Visit original content creator repository https://github.com/buidl-labs/yieldscan-frontend
  • p2p-mirotalk

    MiroTalk P2P

    Free WebRTC – P2P – Simple, Secure, Fast Real-Time Video Conferences Up to 4k and 60fps, compatible with all browsers and platforms.


    p2p.mirotalk.com



    Features
    • Is 100% Free, Open Source, Self Hosted and PWA!
    • No download, plug-in, or login required, entirely browser-based
    • Unlimited number of conference rooms without call time limitation
    • Translated into 133 languages
    • Host protection to prevent unauthorized access to your host
    • Possibility to Password protect the Room for the meeting
    • Desktop and Mobile compatible
    • Optimized Room URL Sharing for mobile
    • Webcam Streaming (Front – Rear for mobile)
    • Audio Streaming crystal clear with detect speaking and volume indicator
    • Screen Sharing to present documents, slides, and more…
    • File Sharing (with drag-and-drop), share any files to your participants in the room
    • Select Audio Input – Output and Video source
    • Ability to set video quality up to 4K and 60 FPS
    • Recording your Screen, Audio and Video
    • Snapshot the video frame and save it as image png
    • Chat with Emoji Picker to show your feelings, private messages, Markdown support, possibility to Save the conversations, and many more
    • ChatGPT (openAI), designed to answer users’ questions, provide relevant information, and connect them with relevant resources
    • Speech recognition to send the speeches
    • Push to talk, like a walkie-talkie.
    • Advance collaborative whiteboard for the teachers
    • Share any YT Embed video, video mp4, webm, ogg and audio mp3 in real-time
    • Full-Screen Mode on mouse click on the Video element, Pin/Unpin, Zoom in-out video element
    • Possibility to Change UI Themes
    • Right-click on the Video elements for more options
    • Direct peer-to-peer connection ensures the lowest latency thanks to WebRTC
    • Supports REST API (Application Programming Interface)
    • Slack API integration
    • Sentry error reporting
    About
    Start videoconference
    Direct Join
    Embed a meeting

    Embedding a meeting into a service or app using an iframe.

    <iframe
        allow="camera; microphone; display-capture; fullscreen; clipboard-read; clipboard-write; autoplay"
        src="https://p2p.mirotalk.com/newcall"
        style="height: 100%; width: 100%; border: 0px;"
    ></iframe>
    Quick start
    • You will need to have Node.js installed; this project has been tested with Node versions 12.x, 14.x, 16.x and 18.x.
    # clone this repo
    $ git clone https://github.com/miroslavpejic85/mirotalk.git
    # go to mirotalk dir
    $ cd mirotalk
    # copy .env.template to .env (edit it according to your needs)
    $ cp .env.template .env
    # install dependencies
    $ npm install
    # start the server
    $ npm start
    Docker

    docker

    # copy .env.template to .env (edit it according to your needs)
    $ cp .env.template .env
    # Copy docker-compose.template.yml in docker-compose.yml (edit it according to your needs)
    $ cp docker-compose.template.yml docker-compose.yml
    # Get official image from Docker Hub
    $ docker pull mirotalk/p2p:latest
    # create and start containers
    $ docker-compose up # -d
    # to stop and remove resources
    $ docker-compose down
    Ngrok – Https

    You can start videoconferencing directly from your local PC and be reachable from any device outside your network simply by reading this documentation, or by exposing it directly over HTTPS.

    Stun & Turn

    Install your own Stun & Turn server by following these steps.

    Rest API
    # The response will give you an entrypoint / Room URL for your meeting, where authorization: API_KEY_SECRET.
    $ curl -X POST "http://localhost:3000/api/v1/meeting" -H "authorization: mirotalk_default_secret" -H "Content-Type: application/json"
    $ curl -X POST "https://p2p.mirotalk.com/api/v1/meeting" -H "authorization: mirotalk_default_secret" -H "Content-Type: application/json"
    $ curl -X POST "https://mirotalk.up.railway.app/api/v1/meeting" -H "authorization: mirotalk_default_secret" -H "Content-Type: application/json"
    $ curl -X POST "https://mirotalk.herokuapp.com/api/v1/meeting" -H "authorization: mirotalk_default_secret" -H "Content-Type: application/json"

    API Documentation

    The API documentation uses Swagger at http://localhost:3000/api/v1/docs. You can also check it out on the live and Heroku instances.

    Hetzner

    Hetzner

    This application is running for demonstration purposes on Hetzner, one of the best cloud providers and dedicated root servers.


    Use my personal link to receive €20 IN CLOUD CREDITS.


    If you need help deploying a MiroTalk P2P instance on your dedicated cloud server, or for other needs, don’t hesitate to contact us at p2p.mirotalk@gmail.com

    Live Demos

    https://p2p.mirotalk.com

    hetzner-qr


    https://mirotalk.up.railway.app

    railway-qr


    Heroku Deploy

    https://mirotalk.herokuapp.com

    heroku-qr

    If you want to deploy a MiroTalk P2P instance on your dedicated server, or for other needs, don’t hesitate to contact us at p2p.mirotalk@gmail.com.

    Self Hosting

    To Self-Host MiroTalk P2P on Your dedicated Server, read this documentation.

    Security

    For security concerns, please follow this documentation.

    Credits
    • ianramzy (html template)
    • vasanthv (webrtc-logic)
    • fabric.js (whiteboard)
    Contributing
    • Contributions are welcome and greatly appreciated!
    • Just run npm run lint before committing
    Questions, Discussions and support
    • For questions, discussions, help & support, join with us on Discord
    License

    AGPLv3

    MiroTalk is free and can be modified and forked. But the conditions of the AGPLv3 (GNU Affero General Public License v3.0) need to be respected. In particular modifications need to be free as well and made available to the public. Get a quick overview of the license at Choose an open source license.

    For a MiroTalk license under conditions other than AGPLv3, please contact us at license.mirotalk@gmail.com or purchase directly from CodeCanyon.

    Support the project

    Do you find MiroTalk useful?

    Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

    BroadcastX Hetzner
    LuvLounge QuestionPro
    BrowserStack

    MiroTalk SFU

    Try also MiroTalk SFU; you can find the difference between the two projects here.

    MiroTalk C2C

    Try also MiroTalk C2C cam 2 cam.

    MiroTalk WEB

    Try also MiroTalk WEB rooms scheduler.

    This project is tested with BrowserStack.

    Visit original content creator repository https://github.com/Arthur-Nitzz/p2p-mirotalk
  • postgresql-11-block

    postgresql-11-block

    Current status of this block: I use it in one fleet and it works pretty well for my purposes.

    You have to run the database setup, initialization, and migrations from one of the other containers, which means installing psql in that container.

    Server configuration:

    The block copies all the files from a shared/persistent directory named /postgresql_shared/conf to the postgresql configuration directory.
    If you want to override any of the postgresql defaults, put the entire configuration file in the shared directory. To automate this,
    have your own container copy the configuration file to the shared directory and have postgresql DEPEND on that
    container in the docker-compose.yml file. For example:

      mycontainer: # copies config files to /postgresql_conf
        volumes:
          - 'resin-data:/postgresql_conf'
    
      postgresql:
        image: bh.cr/g_john_rodley1/postgresql-11-block
        volumes:
          - 'resin-data:/postgresql_conf'
        depends_on:
          - mycontainer  # don't start postgres until mycontainer has copied the right files into /postgresql_conf
    

    The configuration files of most interest are postgresql.conf and pg_hba.conf. The repo contains
    a sample of each.

    postgresql.conf:

    • data_directory – if you leave the default data directory, which points to containerized storage, then your database will disappear on every deployment.

    pg_hba.conf:

    You will likely want to add lines allowing both your local network and other containers to access
    the database. See the sample. Example:

    host    all             all             192.168.21.237/32            md5
    host    all             all             172.0.0.0/8            md5
    

    Database initialization and migration:

    All application-specific database work is the responsibility of the client container.

    Environment variables used in this container (a sample compose snippet follows the list):

    • POSTGRESQL_UNIX_USER unix username of the user who owns/runs the postgresql processes/files, typically “postgres”. Only tested where POSTGRESQL_UNIX_USER and POSTGRESQL_POSTGRES_USER are both set to “postgres”
    • POSTGRESQL_POSTGRES_USER POSTGRES username of the user who owns and has ALL rights within the postgresql installation, typically “postgres”.
    • POSTGRESQL_POSTGRES_PASSWORD postgres password (not unix password) for the postgres user IN POSTGRES. The block resets the postgres user password to this value on startup.
    • POSTGRESQL_SYSTEM_DBNAME name of the system database, typically postgres
    • POSTGRESQL_SHARED_DIRECTORY full path to root shared directory (among containers) under which postgresql container will create additional directories to run postgres, typically /postgresql_shared
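
    For illustration, here is a sketch of how these variables might be set in docker-compose.yml. The values are placeholders and the volumes are omitted; see the compose example above:

      postgresql:
        image: bh.cr/g_john_rodley1/postgresql-11-block
        environment:
          - POSTGRESQL_UNIX_USER=postgres
          - POSTGRESQL_POSTGRES_USER=postgres
          - POSTGRESQL_POSTGRES_PASSWORD=changeme
          - POSTGRESQL_SYSTEM_DBNAME=postgres
          - POSTGRESQL_SHARED_DIRECTORY=/postgresql_shared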

    Visit original content creator repository
    https://github.com/bubblesnet/postgresql-11-block