Compare commits


4 Commits
master ... trie

Author SHA1 Message Date
vi 83c4cf1227 Supercompiled rule tries. 2018-06-30 19:06:31 -06:00
vi 0bbc68493f Some kit for radix tries. 2018-06-28 21:35:56 -06:00
vi 0ec419feca Cull tests against volatile interfaces. 2018-06-28 01:45:41 -06:00
vi b93ff6bbb6 Modern resolve. 2018-06-28 00:08:17 -06:00
19 changed files with 569 additions and 392 deletions

View File

@ -1,51 +1,16 @@
https-everywhere-rules
======================
[https-everywhere-rules](https://github.com/fmap/https-everywhere-rules)
========================================================================
Unencrypted HTTP requests leave users vulnerable to surveillance, content filtering, and in some cases content tampering. While the number of websites making encrypted connections available is steadily increasing, only a fraction of those send visitors to their HTTPS version by default. Efforts to encourage broader and safer adoption of HTTPS-as-default are praiseworthy, but we hold that automatic client-local TLS upgrades are a worthwhile privacy measure for the foreseeable future.
Haskell package providing high-level access to [HTTPS Everywhere][1]
rulesets. It aims to make it easier to specify: "use secure HTTP
connections when possible."
[HTTPS Everywhere](https://www.eff.org/https-everywhere) provides these facilities within the two most common web browsers. But there exist HTTP clients which aren't browsers, as well as unsupported browsers. We wish to supply broader protections, discouraging adversaries from intercepting and manipulating traffic from a wider variety of clients. This library serves as the bedrock of that effort: it provides high-level programmatic access to the HTTPS Everywhere rules database, making it easier to specify "substitute this HTTP endpoint for a corresponding secure endpoint, if you know how."
It's very much a work-in-progress now, though you are very welcome to [contribute](#contributing) to that progress.
Usage
-----
The interface is straightforward:
[1]: https://www.eff.org/https-everywhere
```haskell
λ: :m + Data.HTTPSEverywhere.Rules Network.URI
λ: rules <- getRulesets
λ: let Just it = parseURI "http://httpsnow.org"
λ: rewriteURL rules it
Just https://httpsnow.org
```
There exist demonstration programs:
- [`redirect-proxy`](examples/RedirectProxy.hs) implements a naive redirecting HTTP proxy, writing 302s for any URI HTTPS Everywhere knows how to upgrade.
```
% nohup redirect-proxy &>/dev/null &
[1] 674
% curl -w '%{remote_ip} %{http_code} %{redirect_url}\n' -x localhost:3100 httpsnow.org
127.0.0.1 302 https://httpsnow.org/
```
- [`count-upgradeable`](examples/CountUpgradeable.hs) announces the fraction of URIs received on standard input that HTTPS Everywhere knows how to upgrade:
```
% find vendor/https-everywhere/src/chrome/content/rules -name '*.xml' | shuf -n100 > some_rules
% < some_rules xargs grep default_off | wc -l
20
% < some_rules xargs grep -Pohr '(?<=test url=")[^"]+(?=")' | count-upgradeable
(44,76)
```
Contributing
------------
First of all, thank you.
This is all fairly immature as-is, but you're more than welcome to plod in. The best step you can take right now is to drop by the IRC channel [#https-everywhere-else](irc://irc.freenode.net/#https-everywhere-else) on Freenode: indicate what you want to be doing, and we'll work out how to make it happen; indicate any uncertainty, and we will try to clarify it.
Our current focus is on performance improvements. The current implementation relies on a linear search of the ruleset on each rewrite attempt, which is compromisingly inefficient. We're going to modify the library to perform lookups against a supercompiled suffix trie, so that lookup cost is on the order of the depth of the domain rather than the number of rules, the previous cost having been incurred once, at compile time.
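A minimal sketch of the intended lookup shape (illustrative only: `DomainTrie` and `lookupHost` are made-up names, not this library's API; the real structure is the radix-trie-backed `RuleTrie` introduced in this changeset):
```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch only: a toy trie keyed on domain labels, consulted from the
-- TLD inward, so lookup cost tracks the depth of the domain rather
-- than the total number of rules.
import           Data.Map.Strict (Map)
import qualified Data.Map.Strict as Map
import           Data.Text (Text)
import qualified Data.Text as Text

data DomainTrie a = DomainTrie
  { value    :: Maybe a                  -- rule attached at this exact suffix
  , children :: Map Text (DomainTrie a)  -- keyed on the next label inward
  }

empty :: DomainTrie a
empty = DomainTrie Nothing Map.empty

insert :: Text -> a -> DomainTrie a -> DomainTrie a
insert host v = go (reverse (Text.splitOn "." host))
  where
    go []       t = t { value = Just v }
    go (l : ls) t = t { children = Map.insert l
      (go ls (Map.findWithDefault empty l (children t))) (children t) }

-- O(number of labels in the domain), independent of how many rules exist.
lookupHost :: Text -> DomainTrie a -> Maybe a
lookupHost host = go (reverse (Text.splitOn "." host))
  where
    go []       t = value t
    go (l : ls) t = Map.lookup l (children t) >>= go ls
```
In the changeset below, the trie is built once at compile time via a Template Haskell splice (`Data.HTTPSEverywhere.Rules.Internal.Trie.Supercompilation`), so only the per-lookup cost remains at run time.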

View File

@ -6,7 +6,7 @@ import "foldl" Control.Foldl (Fold(..))
import qualified "foldl" Control.Foldl as Foldl
import "base" Control.Monad (forever)
import "base" Data.Bool (bool)
import "https-everywhere-rules" Data.HTTPSEverywhere.Rules (rewriteURL, RuleSet, getRulesets)
import "https-everywhere-rules" Data.HTTPSEverywhere.Rules (rewriteURL)
import "base" Data.Maybe (isJust)
import "base" Data.Monoid (Sum(..))
import "network-uri" Network.URI (URI, parseURI)
@ -17,8 +17,8 @@ import qualified "pipes" Pipes.Prelude as Pipes (fold)
parse :: Monad m => Pipe String URI m ()
parse = forever $ parseURI <$> await >>= maybe (return ()) yield
check :: [RuleSet] -> Pipe URI Bool IO ()
check rs = forever $ await >>= yield . isJust . rewriteURL rs
check :: Pipe URI Bool IO ()
check = forever $ await >>= yield . isJust . rewriteURL
fold :: Monad m => Fold a b -> Producer a m () -> m b
fold (Fold step begin done) = Pipes.fold step begin done
@ -27,6 +27,4 @@ proportion :: Fold Bool (Int, Int)
proportion = (,) <$> Foldl.foldMap (Sum . bool 0 1) getSum <*> Foldl.length
main :: IO ()
main = do
rulesets <- getRulesets
fold proportion (stdinLn >-> parse >-> check rulesets) >>= print
main = fold proportion (stdinLn >-> parse >-> check) >>= print

View File

@ -7,22 +7,26 @@ module Main (main) where
import "base" Data.Maybe (maybe)
import "base" Data.Monoid ((<>))
import "bytestring" Data.ByteString.Char8 (pack, unpack)
import "http-proxy" Network.HTTP.Proxy (runProxySettings, defaultProxySettings, Settings(..), Request(..))
import "http-proxy" Network.HTTP.Proxy
(runProxySettings, defaultProxySettings, Settings(..), Request(..))
import "http-types" Network.HTTP.Types (status302)
import "https-everywhere-rules" Data.HTTPSEverywhere.Rules (getRulesets, rewriteURL, RuleSet)
import "https-everywhere-rules" Data.HTTPSEverywhere.Rules (rewriteURL)
import "network-uri" Network.URI (URI, parseURI)
import "wai" Network.Wai (Response, responseLBS)
httpRedirect :: URI -> Response
httpRedirect to = responseLBS status302 [("Location", pack . show $ to)] ""
httpRedirect to =
responseLBS status302 [("Location", pack . show $ to)] ""
tryHTTPS :: [RuleSet] -> Request -> Either Response Request
tryHTTPS rs Request{..} = case parseURI . unpack $ requestPath <> queryString of
Just uri -> maybe (return Request{..}) (Left . httpRedirect) (rewriteURL rs uri)
Nothing -> return Request{..}
tryHTTPS :: Request -> Either Response Request
tryHTTPS Request{..} =
case parseURI . unpack $ requestPath <> queryString of
Just uri -> maybe
(return Request{..})
(Left . httpRedirect)
(rewriteURL uri)
Nothing -> return Request{..}
main :: IO ()
main = do
rules <- getRulesets
runProxySettings $ defaultProxySettings
{ proxyRequestModifier = return . tryHTTPS rules }
main = runProxySettings $ defaultProxySettings
{ proxyRequestModifier = return . tryHTTPS }

View File

@ -13,7 +13,7 @@ license-file:
author:
vi
maintainer:
vi@zalora.com
vi@computational.law
category:
Data
build-type:
@ -624,26 +624,33 @@ library
Data.HTTPSEverywhere.Rules.Internal,
Data.HTTPSEverywhere.Rules.Internal.Parser,
Data.HTTPSEverywhere.Rules.Internal.Raw,
Data.HTTPSEverywhere.Rules.Internal.Trie,
Data.HTTPSEverywhere.Rules.Internal.Trie.Supercompilation,
Data.HTTPSEverywhere.Rules.Internal.Types,
Data.HTTPSEverywhere.Rules.Internal.Types.Serialisable,
Data.ByteString.RadixTrie,
Data.Text.RadixTrie,
Data.Text.ICU.Extras,
Paths_https_everywhere_rules
build-depends:
base >= 4.7 && < 4.10,
attoparsec >= 0.12 && < 0.14,
directory >= 1.2 && < 1.4,
errors >= 1.4 && < 2.2,
filepath >= 1.3 && < 1.5,
functor-infix >= 0.0 && < 0.1,
http-client >= 0.3 && < 0.6,
lens >= 4.3 && < 4.16,
mtl >= 2.2 && < 2.3,
network >= 2.6 && < 2.7,
network-uri >= 2.6 && < 2.7,
pipes >= 4.1 && < 4.4,
string-conversions >= 0.3 && < 0.5,
taggy-lens >= 0.1 && < 0.2,
text >= 1.1 && < 1.3,
text-icu >= 0.6 && < 0.8
base,
attoparsec,
bytestring,
directory,
errors,
filepath,
http-client,
lens,
mtl,
network,
network-uri,
pipes,
taggy-lens,
template-haskell,
th-lift-instances,
text,
text-icu,
vector
hs-source-dirs:
src
default-language:
@ -655,24 +662,13 @@ test-suite spec
type:
exitcode-stdio-1.0
build-depends:
base >= 4.7 && < 4.10,
attoparsec >= 0.12 && < 0.14,
directory >= 1.2 && < 1.4,
errors >= 1.4 && < 2.2,
filepath >= 1.3 && < 1.5,
functor-infix >= 0.0 && < 0.1,
hspec >= 2.0 && < 2.5,
http-client >= 0.3 && < 0.6,
lens >= 4.3 && < 4.16,
mtl >= 2.2 && < 2.3,
network >= 2.6 && < 2.7,
network-uri >= 2.6 && < 2.7,
pipes >= 4.1 && < 4.4,
string-conversions >= 0.3 && < 0.5,
taggy-lens >= 0.1 && < 0.2,
text >= 1.1 && < 1.3,
text-icu >= 0.6 && < 0.8,
https-everywhere-rules
base,
attoparsec,
directory,
errors,
hspec,
text,
text-icu
main-is:
Spec.hs
hs-source-dirs:
@ -685,12 +681,6 @@ test-suite spec
ghc-options:
-Wall
other-modules:
Data.HTTPSEverywhere.Rules.Internal,
Data.HTTPSEverywhere.Rules.Internal.Parser,
Data.HTTPSEverywhere.Rules.Internal.ParserSpec,
Data.HTTPSEverywhere.Rules.Internal.Raw,
Data.HTTPSEverywhere.Rules.Internal.Types,
Data.HTTPSEverywhere.Rules.InternalSpec,
Data.Text.ICU.Extras,
Data.Text.ICU.ExtrasSpec
@ -709,7 +699,7 @@ executable redirect-proxy
RedirectProxy.hs
if flag(build-examples)
build-depends:
base >= 4.7 && < 4.10,
base,
bytestring,
http-proxy,
http-types,

View File

@ -0,0 +1,146 @@
{-# LANGUAGE DeriveFunctor #-}
{-# LANGUAGE DeriveLift #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE TupleSections #-}
{-# LANGUAGE ViewPatterns #-}
{-# OPTIONS_GHC -fno-warn-unused-top-binds #-}
module Data.ByteString.RadixTrie (
RadixTrie,
empty,
lookup,
lookupDefault,
elem,
enumerate,
singleton,
insert,
) where
import Prelude hiding
(null, lookup, drop, length, head, tail, take, length, last, elem)
import Data.ByteString
(ByteString, uncons, isPrefixOf, null, drop, length, head, tail,
take, length, last, append, cons)
import Data.Maybe (fromMaybe, isJust, maybeToList)
import Data.Vector (Vector, (!), generate, modify, imap)
import qualified Data.Vector as Vector (toList)
import qualified Data.Vector.Mutable as Mutable (modify)
import Language.Haskell.TH.Syntax (Lift)
import Instances.TH.Lift ()
data Leaf a =
Prefix ByteString (RadixTrie a) |
Split (Vector {- 2 ^ 8 -} (Maybe (RadixTrie a))) |
Vacuum deriving (Lift, Show, Functor)
data RadixTrie a =
RadixTrie { label :: Maybe a, children :: Leaf a }
deriving (Lift, Show, Functor)
empty :: RadixTrie a
empty = RadixTrie Nothing Vacuum
lookup :: ByteString -> (RadixTrie a -> Maybe a)
lookup str RadixTrie{..}
| null str = label
| otherwise = case children of
Prefix prefix child ->
if prefix `isPrefixOf` str
then drop (length prefix) str `lookup` child
else Nothing
Split vector -> do
(c, cs) <- uncons str
trie <- vector ! fromIntegral c
lookup cs trie
Vacuum -> Nothing
lookupDefault :: a -> ByteString -> (RadixTrie a -> a)
lookupDefault def = fmap (fmap (fromMaybe def)) lookup
elem :: ByteString -> (RadixTrie a -> Bool)
elem = fmap (fmap isJust) lookup
enumerate :: RadixTrie a -> [(ByteString, a)]
enumerate (RadixTrie label (Prefix prefix trie)) =
maybe [] (\x -> [("", x)]) label ++
map (\(x, y) -> (prefix `append` x, y)) (enumerate trie)
enumerate (RadixTrie label (Split vector)) =
maybe [] (\x -> [("", x)]) label ++
concat (Vector.toList
(imap (\i t -> maybe []
(map (\(x, y) ->
(cons (fromIntegral i) x, y)) . enumerate) t)
vector))
enumerate (RadixTrie a Vacuum) = map ("",) (maybeToList a)
(?=) :: Eq a => a -> a -> Maybe ()
m ?= n = if m == n then return () else Nothing
singleton' :: ByteString -> (RadixTrie a -> RadixTrie a)
singleton' str end
| null str = end
| length str == 1 = empty { children =
Split . generate (2 ^ (8 :: Int)) $ \i -> do
_ <- fromIntegral i ?= head str
return end }
| otherwise = empty { children = Prefix str end }
singleton :: ByteString -> a -> RadixTrie a
singleton str value = singleton' str (empty { label = return value })
data PrefixResult =
Equal |
Disjoint |
Extends ByteString |
Prefixes ByteString
prefixResult :: ByteString -> ByteString -> PrefixResult
prefixResult source target =
case (source `isPrefixOf` target, target `isPrefixOf` source) of
(True, True) -> Equal
(False, False) -> Disjoint
(True, False) -> Extends $ drop (length source) target
(False, True) -> Prefixes $ drop (length target) source
update :: Int -> (Maybe a -> a) -> (Vector (Maybe a) -> Vector (Maybe a))
update n f v = modify (\v' -> Mutable.modify v' (Just . f) n) v
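-- Insert a key/value pair: at a Split node, recurse into the bucket for
-- the leading byte; at a Prefix edge, compare the key against the stored
-- prefix and either recurse, split the edge, or fan out into a Split
-- node, according to whether the two strings extend, prefix, or diverge
-- from one another.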
insert :: ByteString -> a -> (RadixTrie a -> RadixTrie a)
insert (uncons -> Just (fromIntegral -> c, cs)) val
r@(RadixTrie _ (Split split)) = r { children = Split $
update c (maybe (singleton cs val) (insert cs val)) split }
insert _ val r@(RadixTrie _ (Split _)) = r { label = return val }
insert text val r@(RadixTrie label (Prefix prefix trie)) =
case prefixResult prefix text of
Equal -> r { children = Prefix text trie }
Extends e -> r { children = Prefix prefix $ insert e val trie }
Disjoint ->
r { children = Split . generate (2 ^ (8 :: Int)) $ \i ->
if i == fromIntegral (head text)
then return $ singleton (tail text) val
else if i == fromIntegral (head prefix)
then return $ singleton' (tail prefix) trie
else Nothing }
Prefixes p -> case length text of
0 -> r { label = return val }
1 -> r { children = Split . generate (2 ^ (8 :: Int)) $ \i ->
if i == fromIntegral (head text) then
return $ singleton (tail text) val
else if i == fromIntegral (head prefix) then
return $ singleton' (tail prefix) trie
else Nothing }
_ -> RadixTrie label . Prefix (take (length text) prefix) $
case length prefix - length text of
0 -> error "The impossible happened!"
1 -> RadixTrie (Just val)
(Split . generate (2 ^ (8 :: Int)) $ \i ->
fmap (const trie) (i ?= fromIntegral (last prefix)))
_ -> RadixTrie (Just val) (Prefix p trie)
insert text val (RadixTrie value Vacuum) =
(singleton' text (empty { label = return val })) { label = value }
example :: RadixTrie ByteString
example = foldr (.) id insertions empty where
insertions =
map (\x -> insert x x) ["test", "slow", "slower", "water"]

View File

@ -1,29 +1,22 @@
module Data.HTTPSEverywhere.Rules (
RuleSet,
getRulesets,
RuleTrie,
rewriteURL,
rewriteCookie
) where
import Prelude hiding (null, head)
import Data.Bool (bool)
import Data.Functor.Infix ((<$$>), (<$$$>))
import Network.HTTP.Client (Cookie)
import Network.URI (URI)
import Pipes ((>->))
import Pipes.Prelude (head, null)
import Control.Monad.Identity (Identity(runIdentity))
import Data.HTTPSEverywhere.Rules.Internal (getRulesets, getRulesetsMatching, havingRulesThatTrigger, havingCookieRulesThatTrigger, setSecureFlag, RuleSet)
import Data.Monoid (First(..), Any(..))
import Data.Bool (bool)
import Network.HTTP.Client (Cookie(..))
rewriteURL' :: Monad m => [RuleSet] -> URI -> m (Maybe URI)
rewriteURL' rs url = head $ getRulesetsMatching rs url >-> havingRulesThatTrigger url
import Data.HTTPSEverywhere.Rules.Internal
rewriteURL :: [RuleSet] -> (URI -> Maybe URI)
rewriteURL = runIdentity <$$> rewriteURL'
rewriteURL :: URI -> Maybe URI
rewriteURL uri = getFirst . mconcat $
map (First . havingRulesThatTrigger uri) (getRulesetsMatching uri)
rewriteCookie' :: Monad m => [RuleSet] -> URI -> Cookie -> m (Maybe Cookie)
rewriteCookie' rs url cookie = Just (setSecureFlag cookie) `bool` Nothing <$> null producer
where producer = getRulesetsMatching rs url >-> havingCookieRulesThatTrigger cookie
rewriteCookie :: [RuleSet] -> URI -> (Cookie -> Maybe Cookie)
rewriteCookie = runIdentity <$$$> rewriteCookie'
rewriteCookie :: URI -> (Cookie -> Maybe Cookie)
rewriteCookie uri cookie =
bool Nothing (Just $ setSecureFlag cookie) . getAny . mconcat $
map (Any . havingCookieRulesThatTrigger cookie)
(getRulesetsMatching uri)

View File

@ -1,71 +1,49 @@
{-# LANGUAGE CPP #-}
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE RecordWildCards #-}
module Data.HTTPSEverywhere.Rules.Internal (
getRulesets,
RuleTrie,
getRulesetsMatching,
havingRulesThatTrigger,
havingCookieRulesThatTrigger,
setSecureFlag,
RuleSet,
#ifdef TEST
hasTargetMatching,
hasTriggeringRuleOn,
hasExclusionMatching
#endif
setSecureFlag
) where
import Prelude hiding (readFile, filter)
import Control.Applicative ((<*>), (<$>))
import Control.Lens ((&))
import Control.Monad ((<=<), join)
import Data.Bool (bool)
import Data.Functor.Infix ((<$$>))
import Data.List (find)
import Data.Maybe (isJust)
import Control.Monad (MonadPlus(..))
import Data.Text (pack)
import Data.Maybe (maybeToList)
import Data.Monoid (First(..))
import Network.HTTP.Client (Cookie(..))
import Network.URI (URI)
import Pipes (Pipe, Producer, for, each, await, yield, lift, (>->))
import Pipes.Prelude (filter, toListM)
import Network.URI (URI, uriAuthority, uriRegName)
import Data.HTTPSEverywhere.Rules.Internal.Parser (parseRuleSets)
import qualified Data.HTTPSEverywhere.Rules.Internal.Raw as Raw (getRule, getRules)
import Data.HTTPSEverywhere.Rules.Internal.Types (RuleSet(..), Target(..), Exclusion(..), Rule(..), CookieRule(..))
import Data.HTTPSEverywhere.Rules.Internal.Trie (RuleTrie)
import qualified Data.HTTPSEverywhere.Rules.Internal.Trie as Trie
import Data.HTTPSEverywhere.Rules.Internal.Types
(RuleSet(..), getRule, getExclusion, getCookieRule)
import qualified Data.HTTPSEverywhere.Rules.Internal.Trie.Supercompilation
as Supercompilation (trie)
getRulesets' :: Producer RuleSet IO ()
getRulesets' = lift Raw.getRules
>>= flip (for . each) (flip (for . each) yield <=< lift . (parseRuleSets <$$> Raw.getRule))
trie :: RuleTrie
trie = $(Supercompilation.trie)
getRulesets :: IO [RuleSet]
getRulesets = toListM getRulesets'
unless :: MonadPlus m => Bool -> m a -> m a
unless True action = action
unless False _ = mzero
getRulesetsMatching :: Monad m => [RuleSet] -> URI -> Producer RuleSet m ()
getRulesetsMatching rs url = each rs
>-> filter (flip hasTargetMatching url)
>-> filter (not . flip hasExclusionMatching url)
getRulesetsMatching :: URI -> [RuleSet]
getRulesetsMatching uri = do
let hostname = fmap (pack . uriRegName) (uriAuthority uri)
r@RuleSet{..} <- maybeToList hostname >>= Trie.lookup trie
unless (any (\e -> getExclusion e uri) ruleSetExclusions) $
return r
havingRulesThatTrigger :: Monad m => URI -> Pipe RuleSet URI m ()
havingRulesThatTrigger url = flip hasTriggeringRuleOn url <$> await
>>= maybe (havingRulesThatTrigger url) yield
havingRulesThatTrigger :: URI -> RuleSet -> Maybe URI
havingRulesThatTrigger uri RuleSet{..} =
getFirst . mconcat $ map (First . ($ uri) . getRule) ruleSetRules
havingCookieRulesThatTrigger :: Monad m => Cookie -> Pipe RuleSet Bool m ()
havingCookieRulesThatTrigger cookie = flip hasTriggeringCookieRuleOn cookie <$> await
>>= bool (havingCookieRulesThatTrigger cookie) (yield True)
hasTargetMatching :: RuleSet -> URI -> Bool
hasTargetMatching ruleset url = getTargets ruleset <*> [url] & or
where getTargets = getTarget <$$> ruleSetTargets
hasExclusionMatching :: RuleSet -> URI -> Bool
hasExclusionMatching ruleset url = getExclusions ruleset <*> [url] & or
where getExclusions = getExclusion <$$> ruleSetExclusions
hasTriggeringRuleOn :: RuleSet -> URI -> Maybe URI -- Nothing ~ False
hasTriggeringRuleOn ruleset url = getRules ruleset <*> [url] & find isJust & join
where getRules = getRule <$$> ruleSetRules
hasTriggeringCookieRuleOn :: RuleSet -> Cookie -> Bool
hasTriggeringCookieRuleOn ruleset cookie = getCookieRules ruleset <*> [cookie] & or
where getCookieRules = getCookieRule <$$> ruleSetCookieRules
havingCookieRulesThatTrigger :: Cookie -> RuleSet -> Bool
havingCookieRulesThatTrigger cookie RuleSet{..} =
any (($ cookie) . getCookieRule) ruleSetCookieRules
setSecureFlag :: Cookie -> Cookie
setSecureFlag cookie = cookie { cookie_secure_only = True }

View File

@ -1,74 +1,61 @@
{-# LANGUAGE CPP #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
{-# OPTIONS_GHC -fno-warn-name-shadowing #-}
module Data.HTTPSEverywhere.Rules.Internal.Parser (
parseRuleSets,
#ifdef TEST
parseTarget
#endif
parseSerialisable
) where
import Prelude hiding (take, drop)
import Control.Applicative ((<$>))
import Control.Lens (toListOf, only, to, (^..), (^.), (&), (<&>), _Just)
import Control.Monad (join)
import Control.Monad.State (State, evalState, modify', gets)
import Data.Bifunctor (bimap)
import Data.Bool (bool)
import Data.Functor.Infix ((<$$>))
import Data.Maybe (catMaybes)
import Data.String.Conversions (cs)
import qualified Data.Text as Strict (Text)
import Data.Text (dropEnd, dropWhileEnd, drop, isSuffixOf, takeEnd, take)
import qualified Data.Text.Lazy as Lazy (Text)
import Network.HTTP.Client (Cookie(..))
import Network.URI (URI(uriAuthority), URIAuth(uriRegName), parseURI)
import Text.Taggy.Lens (html, allNamed, attr, Element)
import Data.HTTPSEverywhere.Rules.Internal.Types (RuleSet(..), Target(..), Rule(..), Exclusion(..), CookieRule(..))
import Data.Text.ICU.Extras (match, findAndReplace)
import Data.HTTPSEverywhere.Rules.Internal.Types
(RuleSet(..), fromSerialisable)
import qualified
Data.HTTPSEverywhere.Rules.Internal.Types.Serialisable as Serialisable
parseRule :: Element -> Maybe Serialisable.Rule
parseRule element = Serialisable.Rule
<$> element ^. attr "from"
<*> element ^. attr "to"
parseCookieRule :: Element -> Maybe Serialisable.CookieRule
parseCookieRule element = Serialisable.CookieRule
<$> element ^. attr "host"
<*> element ^. attr "name"
parseSerialisable' :: Element -> Maybe Serialisable.RuleSet
parseSerialisable' txt = txt ^. attr "name" <&> \ruleSetName -> do
let
ruleSetTargets = txt ^.. allNamed (only "target")
. attr "host"
. _Just
. to Serialisable.Target
ruleSetRules = txt ^.. allNamed (only "rule")
. to parseRule
& catMaybes
ruleSetExclusions = txt ^.. allNamed (only "exclusion")
. attr "pattern"
. _Just
. to Serialisable.Exclusion
ruleSetCookieRules = txt ^.. allNamed (only "securecookie")
. to parseCookieRule
& catMaybes
Serialisable.RuleSet
ruleSetName
ruleSetTargets
ruleSetRules
ruleSetExclusions
ruleSetCookieRules
parseSerialisable :: Lazy.Text -> [Serialisable.RuleSet]
parseSerialisable = fmap catMaybes `fmap` toListOf $
html . allNamed (only "ruleset") . to parseSerialisable'
parseRuleSets :: Lazy.Text -> [RuleSet]
parseRuleSets = catMaybes <$$> toListOf $ html . allNamed (only "ruleset") . to parseRuleSet
parseRuleSets = fmap fromSerialisable . parseSerialisable
parseRuleSet :: Element -> Maybe RuleSet
parseRuleSet xml = xml ^. attr "name" <&> \ruleSetName -> do
let ruleSetTargets = xml ^.. allNamed (only "target") . attr "host" . _Just . to parseTarget
ruleSetRules = xml ^.. allNamed (only "rule") . to parseRule & catMaybes
ruleSetExclusions = xml ^.. allNamed (only "exclusion") . attr "pattern" . _Just . to parseExclusion & catMaybes
ruleSetCookieRules = xml ^.. allNamed (only "securecookie") . to parseCookieRule & catMaybes
RuleSet ruleSetName ruleSetTargets ruleSetRules ruleSetExclusions ruleSetCookieRules
parseTarget' :: State (Strict.Text, Strict.Text) Bool
parseTarget' = do
gets ((== ".*") . takeEnd 2 . fst) >>= bool (return ())
(modify' (bimap (dropEnd 2) (dropEnd 1 . dropWhileEnd (/= '.'))))
gets (take 2 . fst) >>= \case
"*." -> do
suffix <- gets (drop 1 . fst) -- Don't drop the dot, we expect some prefix.
isSuffixOf suffix <$> gets snd
_ -> gets (uncurry (==))
parseTarget :: Strict.Text -> Target
parseTarget target = Target $ \uri -> do
case uriRegName <$> uriAuthority uri of
Just domain -> evalState parseTarget' (target, cs domain)
Nothing -> False
parseRule :: Element -> Maybe Rule
parseRule element = do
pattern <- element ^. attr "from"
replacement <- element ^. attr "to"
substitute <- findAndReplace pattern replacement
return . Rule $ join . fmap (parseURI . cs) . substitute . cs . show
parseExclusion :: Strict.Text -> Maybe Exclusion
parseExclusion = Exclusion . (. cs . show) <$$> match
parseCookieRule :: Element -> Maybe CookieRule
parseCookieRule element = CookieRule <$> do
hostMatches <- element ^. attr "host" . to (>>= match)
nameMatches <- element ^. attr "name" . to (>>= match)
return $ \Cookie{..} -> nameMatches (cs cookie_name) && hostMatches (cs cookie_domain)

View File

@ -4,7 +4,6 @@ module Data.HTTPSEverywhere.Rules.Internal.Raw (
) where
import Prelude hiding (readFile)
import Data.Functor.Infix ((<$$>))
import Data.Text.Lazy (Text)
import Data.Text.Lazy.IO (readFile)
import System.Directory (getDirectoryContents)
@ -16,8 +15,8 @@ getRule :: FilePath -> IO Text
getRule = readFile
getRules :: IO [FilePath]
getRules = getDataFileName [] >>= filter isXML <$$> getAbsoluteDirectoryContents
getRules = getDataFileName [] >>= fmap (filter isXML) <$> getAbsoluteDirectoryContents
where isXML = (== ".xml") . takeExtension
getAbsoluteDirectoryContents :: FilePath -> IO [FilePath]
getAbsoluteDirectoryContents x = combine x <$$> getDirectoryContents x
getAbsoluteDirectoryContents x = fmap (combine x) <$> getDirectoryContents x

View File

@ -0,0 +1,89 @@
{-# LANGUAGE DeriveFunctor #-}
{-# LANGUAGE TupleSections #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE DeriveLift #-}
module Data.HTTPSEverywhere.Rules.Internal.Trie (
RuleTrie,
RuleSet,
empty,
insert,
lookup,
enumerate
) where
import Prelude hiding (lookup, null)
import Data.Maybe (fromMaybe, maybeToList)
import Data.Text (Text, null, split, snoc, append)
import Language.Haskell.TH.Syntax (Lift)
import Data.Text.RadixTrie (RadixTrie)
import qualified Data.Text.RadixTrie as RadixTrie
import qualified
Data.HTTPSEverywhere.Rules.Internal.Types as Types
data Trie a = Trie
{ rule :: Maybe a -- set if terminal
, wildcard :: Maybe a
, children :: RadixTrie (Trie a)
} deriving (Show, Functor, Lift)
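-- Targets ending in ".*" are stored in 'wild', keyed on the labels
-- before the wildcard; all other targets go in 'tame'. Lookup consults
-- 'tame' with the full hostname and 'wild' with the TLD dropped.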
data RuleSet a = RuleSet { wild :: Trie a, tame :: Trie a }
deriving (Show, Functor, Lift)
type RuleTrie = RuleSet Types.RuleSet
snoc' :: Text -> Char -> Text
snoc' txt c = if null txt then txt else snoc txt c
enumerate' :: Trie a -> [(Text, a)]
enumerate' (Trie r w c) =
maybe [] (\x -> [("", x)]) r ++
maybe [] (\x -> [("*", x)]) w ++
concatMap (\(x, y) -> map (\(a, b) -> (append (snoc' a '.') x, b)) (enumerate' y)) (RadixTrie.enumerate c)
enumerate :: RuleSet a -> [(Text, a)]
enumerate (RuleSet wild tame) =
(map (\(x, y) -> (append x ".*", y)) (enumerate' wild)) ++
enumerate' tame
emptyTrie :: Trie a
emptyTrie = Trie Nothing Nothing RadixTrie.empty
empty :: RuleSet a
empty = RuleSet emptyTrie emptyTrie
isDot :: Char -> Bool
isDot = (== '.')
insert' :: ([Text], a) -> (Trie a -> Trie a)
insert' ([], a) trie = trie { rule = return a }
insert' (t : ts, a) trie
| t == "*" = trie { wildcard = return a }
| otherwise = trie { children =
RadixTrie.insert t
(insert' (ts, a)
(RadixTrie.lookupDefault emptyTrie
t (children trie)))
(children trie) }
insert :: (Text, a) -> (RuleSet a -> RuleSet a)
insert (s, a) rs =
let parts@(_:_) = reverse (split isDot s) in
if head parts == "*"
then rs { wild = insert' (tail parts, a) (wild rs) }
else rs { tame = insert' (parts, a) (tame rs) }
lookup' :: Trie a -> ([Text] -> [a])
lookup' (Trie rule _ _) [] = maybeToList rule
lookup' (Trie _ wild children) (sub : subsub) =
fromMaybe [] (fmap (flip lookup' subsub)
(sub `RadixTrie.lookup` children)) ++
maybeToList wild
lookup :: RuleSet a -> (Text -> [a])
lookup RuleSet{..} host =
let parts@(_:_) = reverse (split isDot host) in
lookup' tame parts ++ lookup' wild (tail parts)

View File

@ -0,0 +1,41 @@
{-# LANGUAGE TemplateHaskell #-}
module Data.HTTPSEverywhere.Rules.Internal.Trie.Supercompilation (
trie
) where
import Control.Monad ((<=<))
import Pipes (Producer, for, each, yield, lift)
import Pipes.Prelude (fold)
import Language.Haskell.TH
import Data.HTTPSEverywhere.Rules.Internal.Parser (parseSerialisable)
import qualified Data.HTTPSEverywhere.Rules.Internal.Raw as Raw
(getRule, getRules)
import qualified
Data.HTTPSEverywhere.Rules.Internal.Types.Serialisable as Serialisable
import Data.HTTPSEverywhere.Rules.Internal.Trie (RuleSet, empty, insert)
import qualified
Data.HTTPSEverywhere.Rules.Internal.Types as Types
getRulesets :: Producer Serialisable.RuleSet IO ()
getRulesets = lift Raw.getRules >>= (flip (for . each) (flip (for . each) yield <=< lift . (fmap parseSerialisable `fmap` Raw.getRule)))
getTrie :: IO (RuleSet Serialisable.RuleSet)
getTrie = fold
(\tree rule ->
foldr (.) id
(map
(\target -> insert (Serialisable.targetHost target, rule))
(Serialisable.ruleSetTargets rule))
tree)
empty
id
getRulesets
-- getTrie :: IO RuleTrie
-- getTrie = fmap (fmap Types.fromSerialisable) getTrie'
trie :: Q Exp {- :: RuleTrie -}
trie = runIO getTrie >>= \t ->
[| fmap Types.fromSerialisable t |]

View File

@ -1,14 +1,31 @@
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RecordWildCards #-}
module Data.HTTPSEverywhere.Rules.Internal.Types (
RuleSet(..),
Target(..),
Rule(..),
Exclusion(..),
CookieRule(..)
CookieRule(..),
fromSerialisable
) where
import Data.Text (Text)
import Network.HTTP.Client (Cookie)
import Network.URI (URI)
import Prelude hiding (take, drop)
import Control.Monad (join)
import Control.Monad.State (State, evalState, modify', gets)
import Data.Bool (bool)
import Data.Bifunctor (bimap)
import Data.Maybe (catMaybes)
import Data.Text (Text, take, takeEnd, drop, dropEnd, dropWhileEnd, isSuffixOf, pack, unpack)
import Data.Text.Encoding (decodeUtf8)
import Network.HTTP.Client (Cookie(..))
import Network.URI (URI, uriRegName, uriAuthority, parseURI)
import qualified Data.HTTPSEverywhere.Rules.Internal.Types.Serialisable
as Serialisable
import Data.Text.ICU.Extras (match, findAndReplace)
newtype Rule = Rule { getRule :: URI -> Maybe URI }
newtype Target = Target { getTarget :: URI -> Bool }
@ -22,3 +39,48 @@ data RuleSet = RuleSet
, ruleSetExclusions :: [Exclusion]
, ruleSetCookieRules :: [CookieRule]
}
instance Show RuleSet where
show r = "<RuleSet " ++ unpack (ruleSetName r) ++ ">"
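-- The state is (target pattern, hostname). A trailing ".*" on the
-- pattern is stripped along with the hostname's final label; a leading
-- "*." then requires the rest of the pattern (dot included) to be a
-- suffix of the hostname, otherwise the two must match exactly.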
parseTarget' :: State (Text, Text) Bool
parseTarget' = do
gets ((== ".*") . takeEnd 2 . fst) >>= bool (return ())
(modify' (bimap (dropEnd 2) (dropEnd 1 . dropWhileEnd (/= '.'))))
gets (take 2 . fst) >>= \case
"*." -> do
suffix <- gets (drop 1 . fst) -- Don't drop the dot, we expect
-- some prefix.
isSuffixOf suffix <$> gets snd
_ -> gets (uncurry (==))
parseTarget :: Text -> Target
parseTarget target = Target $ \uri -> do
case uriRegName <$> uriAuthority uri of
Just domain -> evalState parseTarget' (target, pack domain)
Nothing -> False
parseExclusion :: Text -> Maybe Exclusion
parseExclusion = fmap (Exclusion . (. pack . show)) `fmap` match
fromSerialisable :: Serialisable.RuleSet -> RuleSet
fromSerialisable Serialisable.RuleSet{..} = RuleSet
{ ruleSetName = ruleSetName
, ruleSetTargets =
map (parseTarget . Serialisable.targetHost) ruleSetTargets
, ruleSetRules = catMaybes . flip map ruleSetRules $
\Serialisable.Rule{..} -> do
substitute <- findAndReplace ruleFrom ruleTo
return . Rule $
join . fmap (parseURI . unpack) . substitute . pack . show
, ruleSetExclusions = catMaybes $ map
(parseExclusion . Serialisable.exclusionPattern)
ruleSetExclusions
, ruleSetCookieRules = catMaybes . flip map ruleSetCookieRules $
\Serialisable.CookieRule{..} -> do
hostMatches <- match cookieRuleHost
nameMatches <- match cookieRuleName
return . CookieRule $ \Cookie{..} ->
nameMatches (decodeUtf8 cookie_name) &&
hostMatches (decodeUtf8 cookie_domain)
}

View File

@ -0,0 +1,30 @@
{-# Language DeriveLift #-}
module Data.HTTPSEverywhere.Rules.Internal.Types.Serialisable
where
import Data.Text (Text)
import Language.Haskell.TH.Syntax (Lift)
import Instances.TH.Lift ()
data Rule = Rule { ruleFrom :: Text, ruleTo :: Text }
deriving (Show, Lift)
data Target = Target { targetHost :: Text }
deriving (Show, Lift)
data Exclusion = Exclusion { exclusionPattern :: Text }
deriving (Show, Lift)
data CookieRule = CookieRule
{ cookieRuleHost :: Text
, cookieRuleName :: Text
} deriving (Show, Lift)
data RuleSet = RuleSet
{ ruleSetName :: Text
, ruleSetTargets :: [Target]
, ruleSetRules :: [Rule]
, ruleSetExclusions :: [Exclusion]
, ruleSetCookieRules :: [CookieRule]
} deriving (Show, Lift)

View File

@ -17,7 +17,6 @@ import Control.Applicative ((<$>), (<*>), (*>), (<|>))
import Control.Error (hush)
import Data.Attoparsec.Text (Parser, char, digit, takeWhile1, string, many', parseOnly)
import Data.Char (digitToInt)
import Data.Functor.Infix ((<$$>), (<&>))
import Data.Maybe (isJust)
import Data.Monoid (Monoid(mappend, mconcat))
import Data.Text (Text)
@ -27,7 +26,7 @@ regex :: Text -> Maybe Regex
regex = hush . regex' []
match :: Text -> Maybe (Text -> Bool)
match = (isJust <$$> find) <$$> regex
match = fmap (fmap isJust <$> find) <$> regex
findAndReplace :: Text -> Text -> Maybe (Text -> Maybe Text)
findAndReplace pattern replacement = (.)
@ -41,7 +40,7 @@ data Segment = Reference Int | Literal Text deriving (Show, Eq)
-- JavaScript admits at most nine capture groups, so this is the correct
-- (and likely intended) interpretation.
parseReference :: Parser Segment
parseReference = char '$' *> digit <&> Reference . digitToInt
parseReference = (Reference . digitToInt) <$> (char '$' *> digit)
parseLiteral :: Parser Segment
parseLiteral = Literal <$> takeWhile1 (/= '$')

View File

@ -0,0 +1,30 @@
module Data.Text.RadixTrie (
RadixTrie,
empty,
lookup,
lookupDefault,
elem,
insert,
enumerate
) where
import Prelude hiding (lookup, elem)
import Data.ByteString.RadixTrie (RadixTrie, empty)
import qualified Data.ByteString.RadixTrie as RT
import qualified Data.Text as Strict (Text)
import Data.Text.Encoding (encodeUtf8, decodeUtf8)
lookup :: Strict.Text -> (RadixTrie a -> Maybe a)
lookup t r = RT.lookup (encodeUtf8 t) r
lookupDefault :: a -> Strict.Text -> (RadixTrie a -> a)
lookupDefault x t r = RT.lookupDefault x (encodeUtf8 t) r
elem :: Strict.Text -> (RadixTrie a -> Bool)
elem t r = RT.elem (encodeUtf8 t) r
insert :: Strict.Text -> a -> (RadixTrie a -> RadixTrie a)
insert t x r = RT.insert (encodeUtf8 t) x r
enumerate :: RadixTrie a -> [(Strict.Text, a)]
enumerate t = map (\(x, y) -> (decodeUtf8 x, y)) (RT.enumerate t)

View File

@ -1,6 +1,8 @@
packages:
- .
extra-deps:
- functor-infix-0.0.5
- http-proxy-0.1.0.5
resolver: lts-8.24
- taggy-0.2.1
- taggy-lens-0.1.2
- http-proxy-0.1.0.6
resolver: lts-11.14
allow-newer: true

View File

@ -1,72 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}
{-# OPTIONS_GHC -fno-warn-orphans #-}
module Data.HTTPSEverywhere.Rules.Internal.ParserSpec (
spec
) where
import Prelude hiding (unlines)
import Data.HTTPSEverywhere.Rules.Internal.Parser (parseTarget, parseRuleSets)
import Data.HTTPSEverywhere.Rules.Internal (hasTargetMatching, hasExclusionMatching, hasTriggeringRuleOn)
import Data.HTTPSEverywhere.Rules.Internal.Types (Target(..), RuleSet(..))
import Data.Maybe (fromJust, isJust)
import Data.Text.Lazy (Text, unlines)
import Network.URI (parseURI)
import Test.Hspec (Spec, describe, it, shouldBe, shouldSatisfy)
spec :: Spec
spec = do
let unsafeParseURL = fromJust . parseURI
describe "parseTarget" $ do
let checkTarget = \target domain -> ($ domain) . getTarget $ parseTarget target
it "*.facebook.com should match s-static.ak.facebook.com" $ do
let url = unsafeParseURL "http://s-static.ak.facebook.com"
checkTarget "*.facebook.com" url `shouldBe` True
it "google.* should match google.nl" $ do
let url = unsafeParseURL "http://google.nl"
checkTarget "google.*" url `shouldBe` True
-- Right-side wildcards apply only one-level deep.
it "google.* should not match google.com.id" $ do
let url = unsafeParseURL "http://google.com.id"
checkTarget "google.*" url `shouldBe` False
-- Wildcards expect there exist some prefix.
it "*.google.* should not match google.nl" $ do
let url = unsafeParseURL "http://google.nl"
checkTarget "*.google.*" url `shouldBe` False
describe "parseRuleSets" $ do
let rulesets = parseRuleSets fixture
it "Should parse a ruleset out of the EFF fixture." $ do
rulesets `shouldSatisfy` not . null
let effRuleSet = head rulesets
it "Should parse out the name of the EFF fixture." $ do
ruleSetName effRuleSet `shouldBe` "EFF"
it "Should parse out two targets from the EFF fixture." $ do
length (ruleSetTargets effRuleSet) `shouldBe` 2
it "Should parse out two rules from the EFF fixture." $ do
length (ruleSetRules effRuleSet) `shouldBe` 2
it "Should parse out one exclusion from the EFF fixture." $ do
length (ruleSetExclusions effRuleSet) `shouldBe` 1
it "A target in the EFF fixture should match eff.org" $ do
let url = unsafeParseURL "http://eff.org/thing"
effRuleSet `hasTargetMatching` url `shouldBe` True
it "A rule in the EFF fixture should trigger on eff.org" $ do
let url = unsafeParseURL "http://eff.org/thing"
effRuleSet `hasTriggeringRuleOn` url `shouldSatisfy` isJust
it "An exclusion in the EFF fixture should trigger on action.eff.org" $ do
let url = unsafeParseURL "http://action.eff.org/thing"
effRuleSet `hasExclusionMatching` url `shouldBe` True
fixture :: Text
fixture = unlines $
[ "<ruleset name=\"EFF\">"
, " <target host=\"eff.org\" />"
, " <target host=\"*.eff.org\" />"
, ""
, " <exclusion pattern=\"^http://action\\.eff\\.org/\"/>"
, " <rule from=\"^http://eff\\.org/\" to=\"https://eff.org/\"/>"
, " <rule from=\"^http://([^/:@]*)\\.eff\\.org/\" to=\"https://$1.eff.org/\"/>"
, "</ruleset>"
]
instance Show RuleSet where
show _ = "<RuleSet>"

View File

@ -1,63 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}
module Data.HTTPSEverywhere.Rules.InternalSpec (
spec
) where
import Prelude hiding (unlines)
import Control.Monad (join)
import Data.Functor ((<$>))
import Data.HTTPSEverywhere.Rules.Internal (hasTargetMatching, hasTriggeringRuleOn, hasExclusionMatching)
import Data.HTTPSEverywhere.Rules.Internal.Types (RuleSet(..), Target(..), Exclusion(..), Rule(..))
import Data.Maybe (fromJust, catMaybes)
import Data.Text (pack, unpack)
import Data.Text.ICU.Extras (match, findAndReplace)
import Network.URI (parseURI)
import Test.Hspec (Spec, describe, it, shouldBe)
spec :: Spec
spec = do
let unsafeParseURL = fromJust . parseURI
describe "hasTargetMatching" $ do
it "Should be True for matching targets." $ do
let url = unsafeParseURL "http://eff.org/thing"
fixture `hasTargetMatching` url `shouldBe` True
it "Should be False for mis-matching targets." $ do
let url = unsafeParseURL "http://gnu.org/thing"
fixture `hasTargetMatching` url `shouldBe` False
describe "hasTriggeringRuleOn" $ do
it "Should be 'Just https://...' for matching rules." $ do
let url = unsafeParseURL "http://eff.org/thing"
fixture `hasTriggeringRuleOn` url `shouldBe` parseURI "https://eff.org/thing"
it "Should be Nothing for mis-matching rules." $ do
let url = unsafeParseURL "http://gnu.org/thing"
fixture `hasTriggeringRuleOn` url `shouldBe` Nothing
describe "hasExclusionMatching" $ do
it "Should be True for matching targets." $ do
let url = unsafeParseURL "http://action.eff.org/thing"
fixture `hasExclusionMatching` url `shouldBe` True
it "Should be False for mis-matching targets." $ do
let url = unsafeParseURL "http://eff.org/thing"
fixture `hasExclusionMatching` url `shouldBe` False
-- We avoid the parser here, for purposes of isolation:
fixture :: RuleSet
fixture = RuleSet
{ ruleSetName = "EFF"
, ruleSetTargets = catMaybes
[ Target . checkURI <$> match "eff\\.org"
, Target . checkURI <$> match ".*\\.eff\\.org"
]
, ruleSetExclusions = catMaybes
[ Exclusion . checkURI <$> match "^http://action\\.eff\\.org/"
]
, ruleSetRules = catMaybes
[ Rule . changeURI <$> findAndReplace "^http://eff\\.org/" "https://eff.org/"
, Rule . changeURI <$> findAndReplace "^http://([^/:@]*)\\.eff\\.org/" "https://$1.eff.org/"
]
, ruleSetCookieRules =
[]
}
where checkURI = (. pack . show)
changeURI f = join . fmap (parseURI . unpack) . checkURI f

View File

@ -5,7 +5,6 @@
module Data.Text.ICU.ExtrasSpec (spec) where
import Control.Applicative ((<$>))
import Control.Lens ((&))
import Data.Text (Text)
import Data.Text.ICU.Extras (match, findAndReplace, Segment(..), parseReplacement)
import Test.Hspec (Spec, describe, it, shouldBe)
@ -16,8 +15,8 @@ spec = do
it "Should be Nothing if provided an invalid regular expression." $ do
match "(" `shouldBe` Nothing
it "Should yield a match function if provided a regular expression." $ do
("xa" &) <$> match "x" `shouldBe` Just True
("xa" &) <$> match "y" `shouldBe` Just False
($ "xa") <$> match "x" `shouldBe` Just True
($ "xa") <$> match "y" `shouldBe` Just False
describe "parseReplacement" $ do
it "Should decompose a replacement string into a sequence [Segment]." $ do
parseReplacement "foo$1bar$4$1" `shouldBe` Just [Literal "foo", Reference 1, Literal "bar", Reference 4, Reference 1]
@ -28,9 +27,9 @@ spec = do
parseReplacement "$10" `shouldBe` Just [Reference 1, Literal "0"]
describe "findAndReplace" $ do
it "Should find and replace based upon a regular expression and pattern." $ do
("barqux" &) <$> findAndReplace "(bar)" "$1baz" `shouldBe` Just (Just "barbazqux")
("barqux" &) <$> findAndReplace "(qux)" "baz$1" `shouldBe` Just (Just "barbazqux")
("barqux" &) <$> findAndReplace "u" "uu" `shouldBe` Just (Just "barquux")
($ "barqux") <$> findAndReplace "(bar)" "$1baz" `shouldBe` Just (Just "barbazqux")
($ "barqux") <$> findAndReplace "(qux)" "baz$1" `shouldBe` Just (Just "barbazqux")
($ "barqux") <$> findAndReplace "u" "uu" `shouldBe` Just (Just "barquux")
instance Show (Text -> Bool) where
show _ = "Text -> Bool"